Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
- experience in Big Data implementation projects
- Experience defining Big Data architectures with different tools and environments: cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc.
- Past experience with data engineering and data-quality tools (Informatica, Talend, etc.)
- Previous involvement in …
Greater London, England, United Kingdom Hybrid / WFH Options
First Derivative
development and the opportunity to design your own path. We support a variety of external training courses and accreditations, such as AWS, GCP, Azure and Cloudera to name a few, and are truly passionate about our Mentor Program, through which our senior colleagues generously set aside personal time to coach and …
- implement and manage data lake/data warehouse platforms (some of the following types of providers: AWS, Microsoft Azure, Google Cloud Platform, Databricks, Snowflake, Cloudera, Spark, MongoDB)
- Done this at companies handling high volumes of data, ideally in retail; other sectors using high-volume data would also be relevant …
- attention to detail, and problem-solving ability to produce high-quality data solutions and products
- Experience analysing very large datasets, ideally using the Cloudera Data Platform
- Experience in asset management, with strong knowledge of asset management/accounting data
- Well versed in data modelling, error handling, version control …
- years Glide experience
- Proficiency in PL/SQL and an additional object-oriented programming language
- Experience with big data platforms such as Cloudera, Azure, Snowflake, etc.
- Ability to roll out analytics dashboards (e.g., Power BI, Tableau Server)
- Ability to translate business needs into technical requirements
Qualifications
- Knowledge of the ServiceNow …
- equivalent analyst position
- Proficiency in PL/SQL and an additional object-oriented programming language (highly desirable)
- Experience with big data platforms such as Cloudera, Azure, Snowflake, etc.
- Structured thinking, with the ability to break down ambiguous problems and propose impactful data-modelling designs
- Ability to roll out analytics dashboards …
- make corrective recommendations
- Monitoring – able to monitor Spark jobs using wider tools such as Grafana to spot cluster-level failures
- Cloudera (CDP) – understanding of how Spark on Cloudera is set up and how the runtime libraries are used by PySpark code
- Prophecy – high-level understanding …