technology and consistently apply best practices.
Qualifications for Software Engineer:
Hands-on experience working with technologies like Hadoop, Hive, Pig, Oozie, MapReduce, Spark, Sqoop, Kafka, Flume, etc. Strong DevOps focus and experience building and deploying infrastructure with cloud deployment technologies like Ansible, Chef, Puppet, etc. Experience with test-driven development …
IDL and their association with the different application services on the domain.
Key responsibilities:
• Expertise in BDM installation, upgrades and configuration.
• Familiar with using the Sqoop arguments required to source data from traditional database systems into Hadoop.
• Familiar with the configurations needed to route users to different queues on the cluster (a small user-side sketch follows this listing).
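Purely as an illustration of the queue-routing point above, and not text from the original advert, the sketch below shows how a single Hive session can be pointed at a specific YARN queue. The HiveServer2 host, user and queue names are placeholder assumptions; in practice, routing users to queues is normally enforced centrally through scheduler queue mappings rather than per session.

    from pyhive import hive  # third-party PyHive package (assumed available)

    # Hypothetical connection details; replace with the real HiveServer2 host and user.
    conn = hive.connect(host="hiveserver2.example.com", port=10000, username="etl_user")
    cursor = conn.cursor()

    # Direct this session's MapReduce and Tez work to a named YARN queue.
    # "etl" is a placeholder queue name, not one taken from the advert.
    cursor.execute("SET mapreduce.job.queuename=etl")
    cursor.execute("SET tez.queue.name=etl")

    cursor.execute("SHOW DATABASES")
    print(cursor.fetchall())

    cursor.close()
    conn.close()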
Norwich, Norfolk, East Anglia, United Kingdom (Hybrid / WFH Options)
Stott & May Professional Search Limited
- Deploy BDM components across environments and automate processes
- Optimise ETL performance and tune code as needed
- Source data from traditional systems into Hadoop using Sqoop (see the sketch after this listing)
- Write and compare native HiveQL vs. BDM job performance
- Configure cluster queues and manage resource usage
- Perform capacity planning across DEV, QA, PROD, and …
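As a purely illustrative aside, and not part of the advert itself, the sketch below shows the kind of Sqoop import the ingestion duty above refers to, wrapped in Python so the examples in this document stay in one language. Every connection detail, table name and path is an invented placeholder.

    import subprocess

    # Hypothetical Sqoop import: the JDBC URL, credentials file, table, split column,
    # HDFS directory and Hive table are all placeholder assumptions.
    sqoop_cmd = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//db-host.example.com:1521/ORCL",
        "--username", "etl_user",
        "--password-file", "/user/etl_user/.sqoop.pwd",   # keeps the password off the command line
        "--table", "CUSTOMER_ACCOUNTS",
        "--split-by", "ACCOUNT_ID",                        # column Sqoop uses to divide work across mappers
        "--num-mappers", "8",
        "--target-dir", "/data/raw/customer_accounts",     # HDFS landing directory
        "--hive-import",                                   # load the result straight into a Hive table
        "--hive-table", "raw.customer_accounts",
    ]

    # Requires the sqoop client on PATH and a configured Hadoop/Hive environment.
    subprocess.run(sqoop_cmd, check=True)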
London, South East England, United Kingdom (Hybrid / WFH Options)
Focused Futures Consultancy LTD
Role: Data Architect (AWS)
Location: London, United Kingdom (Flexible Hybrid Working)
Employment Type: Permanent
Business Unit: Data Management/Analytics
Our client, a global consultancy, harnesses the power of data, artificial intelligence, and deep industry expertise to reinvent business models …
Norwich, Norfolk, United Kingdom (Hybrid / WFH Options)
Hamilton Barnes
day-to-day platform administration, including service monitoring and environment health checks. Move components between environments and contribute to the automation of deployments. Use Sqoop to extract data from traditional databases into Hadoop environments, ensuring efficient ingestion. Run and optimise HiveQL scripts to benchmark and validate load performance against BDM (a rough benchmarking sketch follows this listing) …
Ideally Bring:
• Strong experience in Informatica BDM administration.
• Hands-on knowledge of BDM/EDC/IDL configuration and service management.
• Experience working with Sqoop, HiveQL, and Hadoop-based data environments.
• Familiarity with cluster queue management and big data platform tuning.
• Experience with environment automation and scripting for deployment workflows.
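Again purely as an illustration, and not text from the advert, here is a minimal sketch of the kind of HiveQL validation query mentioned above, timed so its runtime can be set against the equivalent BDM mapping. The HiveServer2 host, database and table names are placeholder assumptions.

    import time
    from pyhive import hive  # third-party PyHive package (assumed available)

    # Hypothetical connection; replace host and database with real values.
    conn = hive.connect(host="hiveserver2.example.com", port=10000, database="raw")
    cursor = conn.cursor()

    start = time.time()
    # Simple row-count check on a placeholder table, used as a crude load-validation benchmark.
    cursor.execute("SELECT COUNT(*) FROM customer_accounts")
    row_count = cursor.fetchone()[0]
    elapsed = time.time() - start

    print(f"rows={row_count}, elapsed={elapsed:.1f}s")

    cursor.close()
    conn.close()

Comparing the elapsed time reported here with the run time of the corresponding BDM job gives the sort of rough native-HiveQL vs. BDM comparison these listings describe.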