Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
experience. ETL tools such as Azure Data Factory (ADF), Databricks, or similar. Data lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. To be considered for this role you MUST have in-depth experience of Hadoop and HDFS. The role comes with an extensive benefits package including a …
Latest data science platforms (e.g., Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g., TensorFlow, MXNet, scikit-learn). Software engineering practices (coding standards, unit testing, version control, code review). Hadoop distributions (Cloudera, Hortonworks), NoSQL databases (Neo4j, Elastic), streaming technologies (Spark Streaming). Data manipulation and wrangling techniques. Development and deployment technologies (virtualisation, CI tools like Jenkins, configuration management with Ansible, containerisation with Docker) …
following architectural frameworks (TOGAF, Zachman, FEAF). Cloud experience: AWS or GCP preferred, particularly around migrations and cloud architecture. Good technical knowledge and understanding of big data frameworks such as Hadoop and Cloudera. Deep technical knowledge of database development, design and migration. Experience of deployment in the cloud using Terraform or CloudFormation. Automation or scripting experience using languages such as Python and Bash. …
and AI solutions. Experience with cloud platforms, preferably IBM Cloud. Contributions to open-source projects or personal projects demonstrating Big Data and Java development skills. Relevant certifications such as Cloudera Certified Associate (CCA) or Hortonworks Certified Developer (HCD) are considered a plus. By joining IBM's Public Sector team as a Big Data Java Developer, you'll have the opportunity …
/AWS/GCP. Experience building distributed systems using solutions such as Spark and other big data technologies is preferred but not mandatory. Knowledge of big data querying tools (the Cloudera stack or similar), e.g. Hive or Impala, is preferred but not mandatory. Experience working on parallel development tracks at the same time is required. Experience in leading smaller development …
City of London, London, United Kingdom Hybrid / WFH Options
Tec Partners
or Ansible. Comfortable working with cloud platforms (e.g., AWS, Azure, GCP) and container orchestration tools like Kubernetes. Excellent troubleshooting skills and a collaborative approach. Bonus: experience with Cloudera, Hadoop, CDSW, or CML is a plus. What's on Offer: flexible hybrid working arrangements; core benefits including private healthcare, dental, life assurance and pension; optional benefits including health cash …
end users or large data sets with 10M+ database records. This is a very large Big Data platform. Experience building REST services (orchestration layer) on CRUD data services based on the Cloudera Hadoop stack, with an emphasis on performance optimization. Understanding of how to secure data in a REST architecture. Knowledge of scaling web applications, including load balancing, caching, indexing, normalization, etc. Proficiency …
Apache Hadoop. We are looking for open-source contributors to Apache projects who have an in-depth understanding of the code behind the Apache ecosystem, experience with Cloudera or a similar distribution, and in-depth knowledge of the big data tech stack. Requirements: experience of platform engineering along with application engineering (hands-on); experience in design of an open …