London, South East England, United Kingdom (Hybrid / WFH Options)
InterEx Group
… experience in Big Data implementation projects. Experience in the definition of Big Data architecture with different tools and environments: cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience in Data Engineering and data quality tools (Informatica, Talend, etc.). Previous involvement in …
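For illustration only, a minimal sketch, assuming a local Kafka broker and the kafka-python client, of the event-ingestion glue an architecture like this (Kafka feeding NoSQL/ELK stores) typically starts from. The broker address, topic name, and payload are hypothetical, not taken from the posting.

```python
import json
from kafka import KafkaProducer

# Assumed local broker; a real deployment would list the cluster's brokers.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a JSON event; downstream consumers would land it in
# Cassandra/MongoDB or index it into ELK. Topic name is illustrative.
producer.send("customer-events", {"id": 42, "action": "signup"})
producer.flush()
```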
… a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.). Preferred qualifications, capabilities, and skills: knowledge of AWS; knowledge of Databricks; understanding of Cloudera Hadoop, Spark, HDFS, HBase, and Hive; understanding of Maven or Gradle. About the Team: J.P. Morgan is a global leader in financial services, providing strategic advice …
… cloud platforms, preferably IBM Cloud. Contributions to open-source projects or personal projects demonstrating Big Data and Java development skills. Relevant certifications such as Cloudera Certified Associate (CCA) or Hortonworks Certified Developer (HCD) are considered a plus. By joining IBM's Public Sector team as a Big Data Java Developer …
… DV Clearance. WE NEED THE DATA ENGINEER TO HAVE: Current DV clearance (MOD or Enhanced). Experience with big data tools such as Hadoop, Cloudera or Elasticsearch. Experience with Palantir Foundry. Experience working in an Agile Scrum environment with tools such as Confluence/Jira. Experience in design, development, test …
… best practices to streamline data workflows and reduce manual interventions. Must have: AWS, ETL, EMR, Glue, Spark/Scala, Java, Python. Good to have: Cloudera (Spark, Hive, Impala, HDFS), Informatica PowerCenter, Informatica DQ/DG, Snowflake, Erwin. Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, or … years of experience in data engineering, including working with AWS services. Proficiency in AWS services like S3, Glue, Redshift, Lambda, and EMR. Knowledge of Cloudera-based Hadoop is a plus. Strong ETL development skills and experience with data integration tools. Knowledge of data modeling, data warehousing, and data transformation techniques. …
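As a point of reference, a minimal sketch of the Spark ETL pattern this stack implies: PySpark reading raw data from S3 and writing curated Parquet, as it might run on EMR or Glue. The bucket paths, column names, and filter condition are assumptions for illustration, not details from the role.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in S3 (hypothetical bucket).
orders = spark.read.option("header", "true").csv("s3://example-raw/orders/")

# Transform: cast the amount column and keep completed orders only.
cleaned = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("status") == "COMPLETED")
)

# Load: columnar output for downstream Redshift/Athena consumption.
cleaned.write.mode("overwrite").parquet("s3://example-curated/orders/")
```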
… expertise in Terraform, Kubernetes, Shell/PowerShell scripting, CI/CD pipelines (GitLab, Jenkins), Azure DevOps, IaC, and experience with big data platforms like Cloudera, Spark, and Azure Data Factory/Databricks. Key Responsibilities: Implement and maintain Infrastructure as Code (IaC) using Terraform, Shell/PowerShell scripting, and CI/… the Technical and Solution Architect teams to design the overall solution architecture for end-to-end data flows. Utilize big data technologies such as Cloudera, Hue, Hive, HDFS, and Spark for data processing and storage. Ensure smooth data management for marketing consent and master data management (MDM) systems. Key Skills: … integration and delivery for streamlined development workflows. Azure Data Factory/Databricks: experience with these services is a plus for handling complex data processes. Cloudera (Hue, Hive, HDFS, Spark): experience with these big data tools is highly desirable for data processing. Azure DevOps, Vault: core skills for working in Azure …
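For context, a sketch of the Cloudera-side processing described above, assuming a cluster where Spark is configured against the Hive metastore (via hive-site.xml). The database, table, column names, and HDFS path are hypothetical, chosen only to mirror the marketing-consent/MDM flow the ad mentions.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("consent-rollup")
    .enableHiveSupport()   # picks up the cluster's Hive metastore config
    .getOrCreate()
)

# Read a managed Hive table (the kind browsable in Hue) of consent events.
consents = spark.table("marketing.consent_events")

# Roll up to the latest consent flag per customer.
latest = consents.groupBy("customer_id").agg({"consented": "max"})

# Persist to HDFS as Parquet for the downstream MDM feed.
latest.write.mode("overwrite").parquet("hdfs:///data/mdm/consent_latest")
```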