Sittingbourne, Kent, South East, United Kingdom Hybrid / WFH Options
Southern Housing
requirements into system functional and non-functional requirements. Experience with popular database programming languages including SQL and PL/SQL, possibly extending to NoSQL/Hadoop-oriented and non-relational databases. In your supporting statement, it is important that you address how you meet each of the above criteria, providing real more »
Experience of Data Lake/Hadoop platform implementation. Hands-on experience in implementing and performance-tuning Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro). Experience with one … or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code in one or more programming languages (Java, Python, etc.) Preferred Qualifications: Master's or PhD in Computer Science, Physics, Engineering or Math. Hands-on experience leading large-scale global data warehousing and analytics projects. Ability more »
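For candidates unfamiliar with the stack named above: the core pattern behind Hadoop (and mirrored by Spark's RDD API) is map/shuffle/reduce. This is a toy pure-Python sketch of the canonical word-count job — illustrative only, not code for any specific framework:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    """Shuffle: group all values by key, as Hadoop does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

# Two stand-in "input splits"; a real cluster would run map tasks in parallel.
lines = ["big data on hadoop", "spark on hadoop"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
```

The same logic is what a Hive or Spark SQL `GROUP BY` compiles down to across a cluster.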
instrumental in ensuring the security and efficiency of the data handling and reporting processes. Key Responsibilities: Data Processing: Utilize Apache Spark, AWS RDS, and Hadoop to process large datasets efficiently and securely. Reporting: Generate comprehensive and insightful reports using Tableau. Business Rules Management: Implement and manage business rules with more »
EC1N, Farringdon Without, Greater London, United Kingdom
Damia Group Ltd
your key responsibilities will be to: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. Skills/Experience Required of the SC Cleared DevSecOps Engineer: Strong operational procedures knowledge. more »
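As a rough illustration of the transform-and-aggregate flow these responsibilities describe, here is a pure-Python stand-in for the logical steps a Spark job distributes across a cluster (the record and field names are hypothetical, not from any listing):

```python
# Hypothetical raw records, standing in for rows read from AWS RDS/MySQL.
records = [
    {"region": "UK", "amount": 120.0},
    {"region": "UK", "amount": 80.0},
    {"region": "FR", "amount": 50.0},
]

# Transform: filter, then aggregate by key -- the same logical steps a
# Spark job would express with filter() and groupBy()/reduceByKey().
totals = {}
for row in (r for r in records if r["amount"] > 0):
    totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
```

The aggregated `totals` is the kind of summarised dataset that would then feed a Tableau report.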
Employment Type: Permanent
Salary: £50000 - £65000/annum 15% cash flex and 10% bonus
Red Hat Decision Central Key Responsibilities: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. About Capgemini: Capgemini is a global leader in partnering with companies to transform and more »
Data Lake/Hadoop platform implementation. Good level of hands-on experience in implementing and performance-tuning Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro). Experience with one … or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code in one or more programming languages (Java, Python, etc.) Preferred Qualifications: Master's or PhD in Computer Science, Physics, Engineering or Maths. Hands-on experience leading large-scale global data warehousing and analytics projects. Ability more »
Experience with relational databases (RDS) such as MySQL or PostgreSQL, and experience with NoSQL stores such as Redis or MongoDB; * Experience with Big Data technologies such as the Hadoop ecosystem is a plus. * Excellent writing, proofreading and editing skills. Able to create documentation that can express cloud architectures using text and more »
Key Skills 3+ years of Python experience Highly statistical and analytical Exposure to Google Cloud Platform (BigQuery, GCS, Datalab, Dataproc, Cloud ML) (desirable) Spark & Hadoop experience Strong communication skills Good problem-solving skills Qualifications Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering … and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau) This is a permanent position, and offers flexibility with hybrid working, 2-3 days per week in the office, depending on workload more »
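The regression techniques this role asks about reduce to a small example: the sketch below fits a one-variable least-squares line using the standard closed-form slope/intercept formulas. It is illustrative only, with made-up data, and stands in for what a library like scikit-learn would do in practice:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b in one variable."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Toy data lying exactly on y = 2x + 1.
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```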
3 Development resources (London) with experience in designing and building platforms, and supporting applications both in cloud environments and on-premises. These resources are expected to be open-source contributors to Apache projects, have an in-depth understanding of the more »
Version 1 has celebrated over 26 years in the Technology industry and continues to be trusted by global brands to deliver IT solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat more »
Senior Software Engineer - Data Initial 6-month contract role Remote Working £525 - £625, Inside IR35 We're recruiting on behalf of an IT Services Provider for a Senior Software Engineer (Data) to design and develop modern data management tools. As more »
Key responsibilities: Develop robust architectures and designs for big data platforms and applications within the Apache Hadoop ecosystem. Implement and deploy big data platforms and solutions on-premises and in hybrid cloud environments. Read, understand, and modify open-source code to implement bug fixes and perform upgrades. Ensure all … Your Profile Key Skills/Knowledge/Experience: Proven experience in architecting, designing, building, and deploying big data platforms and applications using the Apache Hadoop ecosystem in hybrid cloud and private cloud scenarios. Experience with hybrid cloud big data platform designs and deployments, especially in AWS, Azure, or Google … Cloud Platform. Experience in large-scale data platform builds and application migrations. Expert knowledge of the Apache Hadoop ecosystem and associated Apache projects (e.g. HDFS, Hive, HBase, Spark, Ranger, Kafka, YARN, etc.). Proficiency in Kubernetes for container orchestration. Strong understanding of security practices within big data environments. Ability to more »
Leading ecommerce client are now searching for a Senior Data Engineer to contribute towards the delivery of their data strategy. This engineer will shape our client’s data function, delivering end-to-end solutions for an array of customer data more »
least 2 matches) - Python, Groovy, JavaScript/TypeScript Experience in at least 2 of these technologies - Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow, and other similar technologies Banking experience Some experience with the AWS cloud tech stack and native services Previous experience of building and … around them, so it is key that they have experience of working with the Streaming & Batch technology stack - Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow, and other similar technologies SME-level skills and experience in troubleshooting and resolving day-to-day issues relating to test automation more »
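One core idea behind the streaming stack this role lists is key-based partitioning: Kafka routes records with the same key to the same partition, so per-key ordering is preserved across consumers. A minimal sketch of that routing rule follows — note this is not Kafka's actual default partitioner (which uses a murmur2 hash); a stable stdlib checksum is used so the sketch is reproducible:

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    """Deterministically map a record key to a partition index.

    CRC32 stands in for Kafka's murmur2 hash; the modulo step is the
    same idea: equal keys always route to the same partition.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Records sharing a key land on the same partition, so a consumer
# sees all events for "account-42" in the order they were produced.
p1 = partition_for("account-42", 6)
p2 = partition_for("account-42", 6)
```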
Director of Data & AI London based We are searching for a Director of Data and Artificial Intelligence - someone with hands-on experience designing AI solutions to solve complex business problems. Your new role is a leadership position at a business more »
Senior Vice President Data & AI London based We are searching for a Senior Vice President of Data and Artificial Intelligence - someone with hands-on experience designing AI solutions to solve complex business problems. Your new role is a leadership position more »
to translate business requirements into high and low-level designs. You'll also define architecture and technical designs, create data flows and integrations using Hadoop, and work closely with product teams throughout testing. Key Responsibilities: Lead Java and Python project development. Design and develop API integrations using Spark. Collaborate … client teams. Stay updated with the latest trends and best practices. Qualifications: Expertise in Java and Python development (Essential). Experience with Spark or Hadoop (Essential). Knowledge of Trino or Airflow (Desirable). Proven ability to design and implement scalable and secure solutions. Excellent communication and collaboration skills. more »
with JIRA, strong knowledge of Python for automation, and expertise in test strategy, test management, and defect management. Additionally, familiarity with SQL, Big Data (Hadoop), and ETL, and a basic understanding of AWS are desirable. Responsibilities: Configure and manage JIRA projects, dashboards, defects, and test cases. Develop and maintain in-house … on experience with Python scripting. Strong understanding of test management and defect tracking. Knowledge of SQL for querying and reporting. Familiarity with Big Data (Hadoop) and ETL processes. Basic understanding of AWS services. If you are interested, apply here more »
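The SQL-for-querying-and-reporting requirement above can be illustrated with a self-contained sketch using Python's built-in sqlite3 module; the table and column names are invented for the example, standing in for a real test-management backend:

```python
import sqlite3

# In-memory database standing in for a real defect-tracking store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE defects (id INTEGER, severity TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO defects VALUES (?, ?, ?)",
    [(1, "high", "open"), (2, "low", "closed"), (3, "high", "open")],
)

# Reporting query: count open defects grouped by severity.
rows = conn.execute(
    "SELECT severity, COUNT(*) FROM defects "
    "WHERE status = 'open' GROUP BY severity ORDER BY severity"
).fetchall()
```

The same `GROUP BY` pattern drives most defect-summary dashboards, whatever the backing database.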