Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
Experience in Big Data implementation projects. Experience in the definition of Big Data architecture with different tools and environments: cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience with Data Engineering and data quality tools (Informatica, Talend, etc.). Previous involvement in …
Greater London, England, United Kingdom Hybrid / WFH Options
First Derivative
development and the opportunity to design your own path. We support a variety of external training courses and accreditations such as AWS, GCP, Azure and Cloudera, to name a few, and are truly passionate about our Mentor Program, through which our senior colleagues generously set aside personal time to coach and …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
First Derivative
development and the opportunity to design your own path. We support a variety of external training courses and accreditations such as AWS, GCP, Azure and Cloudera, to name a few, and are truly passionate about our Mentor Program, through which our senior colleagues generously set aside personal time to coach and …
Implement and manage data lake/data warehouse platforms (some of the following types of providers: AWS, Microsoft Azure, Google Cloud Platform, Databricks, Snowflake, Cloudera, Spark, MongoDB). Has done this at companies handling high volumes of data, ideally in retail; other sectors where high-volume data is used would also be relevant …
Attention to detail, and problem-solving ability to produce high-quality data solutions and products. Experience with analysing very large datasets, ideally using the Cloudera Data Platform. Experience in asset management with strong knowledge of asset management/accounting-type data. Well-versed in data modelling, error handling, version control …
Philadelphia, Pennsylvania, United States Hybrid / WFH Options
Comcast Corporation
An Agile development environment; perform DevOps processes using Concourse, Docker, and Kubernetes; perform large-scale data processing using Apache Spark; manage big data on Cloudera; perform Machine Learning, including developing and deploying predictive models leveraging ML algorithms; use the AWS cloud platform; deploy tools and applications on Unix; write SQL and … one (1) year of experience programming using Python and Scala; using Jira; performing large-scale data processing using Apache Spark; managing big data on Cloudera; performing Machine Learning; using the AWS cloud platform; deploying tools and applications on Unix; and writing SQL in Hive and NoSQL databases. Disclaimer: This information has …
Years of Glide experience. Proficiency in PL/SQL and an additional object-oriented programming language. Experience with big data platforms such as Cloudera, Azure, Snowflake, etc. Ability to roll out analytics dashboards (e.g., Power BI, Tableau Server). Ability to translate business needs into technical requirements. Qualifications: knowledge of the ServiceNow …
Make corrective recommendations. Monitoring – Be able to monitor Spark jobs using wider tools such as Grafana to see whether there are cluster-level failures. Cloudera (CDP) – Knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – High-level understanding …