…is clearly related to the position. Active TS/SCI clearance with polygraph. Ability to develop Elasticsearch & MapReduce analytics. Experience with Java MapReduce, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop Hive, Pig, etc. Experience with distributed, scalable Big Data stores (NoSQL) such as H…
…the position. Computer Science (CS) degree or related field. TS/SCI clearance with polygraph. Strong Java skills. Customer GHOSTMACHINE analytic development. Experience with Hadoop (MapReduce & Accumulo). Experience with software configuration using GitLab, Git, Gitflow, and Nexus. Experience with Linux. You Might Also Have: some networking knowledge, familiarity…
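The two listings above both ask for hands-on Java MapReduce experience. Purely as an illustration of the MapReduce pattern (not taken from any listing, and in Python rather than Java for consistency with the other sketches in this section), here is a word-count mapper and reducer in the Hadoop Streaming style; the file names are hypothetical.

    #!/usr/bin/env python3
    # mapper.py -- emit "word<TAB>1" for every token read from stdin.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

    #!/usr/bin/env python3
    # reducer.py -- sum the counts per word. Hadoop sorts mapper output by
    # key before the reduce phase, so equal words arrive consecutively.
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")

A typical invocation (the streaming jar path varies by distribution) would be: hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /in -output /out.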
…Cycle (SDLC). Manage and analyze large volumes of unstructured data from internal and external sources using Hive, Impala, Oozie, Spark (Scala), Sqoop, Flume, the Hadoop API, and HDFS to optimize data loads and data transformations. Apply testing techniques, including unit, system, and regression testing, to verify deployed components work … requirements. Two (2) years of experience implementing machine learning models and monitoring their performance, and two (2) years of experience utilizing Hadoop (including Hive, Impala, and Oozie) to analyze large volumes of unstructured data from internal and external sources. In lieu of a Bachelor's degree…
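The listing above names Hive, Impala, Oozie, Spark, and HDFS for optimizing data loads and transformations. A minimal sketch of such a load-and-transform step, written in PySpark rather than the Scala the listing mentions, with invented paths and column names:

    # Hypothetical PySpark job: read raw JSON from HDFS, clean it,
    # and write partitioned Parquet back out for downstream queries.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    raw = spark.read.json("hdfs:///data/raw/events")          # assumed source path
    cleaned = (raw
               .filter(F.col("event_type").isNotNull())       # drop malformed records
               .withColumn("event_date", F.to_date("timestamp")))
    (cleaned.write
            .mode("overwrite")
            .partitionBy("event_date")                        # partition for cheaper scans
            .parquet("hdfs:///data/curated/events"))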
…problems and solutions. Advanced experience developing business deliverables that leverage business intelligence platforms, data management platforms, or SQL-based languages (Tableau, Business Objects, Snowflake, Hadoop, Netezza, NoSQL, ANSI SQL, or related). Proven, expert ability to use or build business knowledge through meaningful partnerships at the individual contributor, leadership…
…years of Data Architecture. Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud Platform). Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka). Experience with marketing analytics tools (e.g., Google Analytics, Adobe Analytics, Salesforce Marketing Cloud) and how to drive key performance indicators with them. Knowledge of…
…APIs and microservices. Familiarity with data engineering concepts and tools such as data pipelines, ETL processes, SQL, and big data technologies (e.g., Apache Spark, Hadoop). Strong problem-solving skills, analytical thinking, and attention to detail. Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams…
…AWS SageMaker, or Azure Machine Learning for model development and deployment. Data Analytics and Big Data Technologies: proficient in big data technologies such as Hadoop, Spark, and Kafka for handling large datasets. Experience with data visualization tools like Tableau, Power BI, or Qlik for deriving actionable insights from data…
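Several of the listings here, including the one above, pair Spark with Kafka for handling large datasets. As a minimal sketch of producing and consuming Kafka messages, assuming a broker on localhost and the kafka-python client; the topic name and payload are made up:

    # Round-trip one message through a local Kafka broker (kafka-python).
    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("clicks", b'{"user": 42, "page": "/home"}')  # hypothetical topic/payload
    producer.flush()                                           # block until delivered

    consumer = KafkaConsumer("clicks",
                             bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest",     # read from the start
                             consumer_timeout_ms=5000)         # stop when idle
    for message in consumer:
        print(message.value)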
…data preprocessing, feature engineering, and data visualization techniques. Proficiency in working with large-scale datasets, SQL and NoSQL databases, and big data processing frameworks (e.g., Hadoop, Spark). Familiarity with software engineering best practices, including version control, testing, and code review. Strong mathematical and statistical skills, with the ability to apply statistical…
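The listing above asks for data preprocessing and feature engineering on large-scale datasets. A minimal pandas/scikit-learn sketch of both steps; the file and column names are hypothetical:

    # Derive two simple features and standardize them for modelling.
    import numpy as np
    import pandas as pd
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("transactions.csv")                 # assumed input file
    df["log_amount"] = np.log1p(df["amount"])            # tame a right-skewed feature
    df["day_of_week"] = pd.to_datetime(df["ts"]).dt.dayofweek

    features = ["log_amount", "day_of_week"]
    X = StandardScaler().fit_transform(df[features])     # zero mean, unit variance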
Edinburgh, Central Scotland, United Kingdom Hybrid / WFH Options
Change Digital – Digital & Tech Recruitment
…Glue, AWS Redshift, and Python. Experience with ETL processes, data integration, and data warehousing. Strong SQL skills. Experience with big data technologies such as Hadoop, Spark, and Kafka. Familiarity with cloud platforms (AWS, Azure, Google Cloud). Working knowledge of data visualisation tools (Power BI, Tableau, Qlik Sense). Additional skills: client…
…large-scale data science/data analytics projects. Ability to lead effectively across organizations. Hands-on experience with data analytics technologies such as AWS, Hadoop, Spark, Spark SQL, MLlib, or Storm/Samza. Implementing AWS services in a variety of distributed computing, enterprise environments. Proficiency with at least one…
…Docker, Kubernetes). CI/CD pipelines and tools (e.g. dbt, Jenkins, GitLab CI). Desirable: experience with analytics tools and frameworks (e.g., Apache Spark, Hadoop); SQL; SageMaker, DataRobot; Google Cloud and Azure; data platform metadata-driven frameworks to ingest, transform, and manage data…
…modeling, data access, and data storage techniques. Excellent problem-solving skills and the ability to think algorithmically. Desirable Skills: knowledge of big data technologies (Hadoop, Spark, Kafka) is highly desirable. Familiarity with data governance and compliance requirements.
…as dbt, Fivetran, etc. Understanding of Agile delivery best practice. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet (an Airflow sketch follows the next listing header). Strong problem-solving and analytical abilities. Ability to present solutions and limitations to non-IT business experts. ABOUT YOU: integrity, respect, intellectually curious…
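The listing above groups ETL pipelines with Airflow. A minimal Airflow DAG sketch, assuming Airflow 2.4 or later (where the schedule argument replaced schedule_interval); the DAG id, schedule, and task bodies are invented:

    # Two-task DAG: a daily extract step feeding a load step.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull rows from the source system")        # placeholder body

    def load():
        print("write rows to the warehouse")             # placeholder body

    with DAG(dag_id="etl_sketch",
             start_date=datetime(2024, 1, 1),
             schedule="@daily",
             catchup=False) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task                        # load runs after extract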
…through improved data handling and analysis. Responsibilities: build predictive models using machine-learning techniques that generate data-driven insights on modern data platforms (Spark, Hadoop, and other map-reduce tools), as sketched below; develop and productionize containerized algorithms for deployment in hybrid cloud environments (GCP, Azure); connect and blend data from various…
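For the responsibility above (predictive models on Spark), a hedged Spark MLlib sketch; the input path, feature columns, and label column are all invented for illustration:

    # Assemble features and fit a logistic regression with Spark MLlib.
    from pyspark.ml.classification import LogisticRegression
    from pyspark.ml.feature import VectorAssembler
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("model-sketch").getOrCreate()
    df = spark.read.parquet("hdfs:///data/curated/training")   # assumed training set

    assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
    train = assembler.transform(df)                            # add a feature vector column

    model = LogisticRegression(labelCol="label").fit(train)
    print(model.summary.areaUnderROC)                          # quick training-fit check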
Guildford, South East England, United Kingdom Hybrid / WFH Options
Hawksworth
…data warehousing and ETL frameworks. Proficiency in working with relational databases (e.g., Oracle, PostgreSQL), Parquet/Delta files, and big data technologies (e.g. Synapse, Hadoop, Spark, Kafka). Experience working with Microsoft Azure and associated data services. Strong analytical and data interpretation skills, with the ability to communicate findings to…
London, England, United Kingdom Hybrid / WFH Options
Global Relay
…JMeter or similar tools. Web services technology such as REST, JSON, or Thrift. Testing web applications with Selenium WebDriver (see the sketch below). Big data technology such as Hadoop, MongoDB, Kafka, or SQL. Network principles and protocols such as HTTP, TLS, and TCP. Continuous integration systems such as Jenkins or Bamboo. Continuous delivery…
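One of the skills listed above is testing web applications with Selenium WebDriver. A minimal Python sketch of such a check; the URL and locator are made up, and Selenium 4's built-in Chrome driver management is assumed:

    # Open a login page, type a username, and make a trivial assertion.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")              # hypothetical page
        driver.find_element(By.NAME, "username").send_keys("test-user")
        assert "Login" in driver.title                       # placeholder check
    finally:
        driver.quit()                                        # always release the browser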
Do you tick these 4 boxes? If so, please read on… Must haves: Python; data modelling, data warehousing, and ETL frameworks; Oracle, PostgreSQL; Synapse, Hadoop, Spark, Kafka. Exciting times to be part of an enterprise environment like this one; there won't be many on the same scale! The…
…and reports. 7. Knowledge of data integration techniques and tools (e.g., SSIS, Informatica) is desirable. 8. Experience working with big data technologies (e.g., Hadoop, Spark) is a plus. 9. Excellent communication and collaboration skills, with the ability to effectively interact with technical and non-technical stakeholders. 10. Strong…
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
G.Digital
…experience applying ML models. Python, SQL, or R. Any cloud data platforms would be great: AWS, Azure, or GCP. Any big data technologies like Hadoop would be great. Strong communication and stakeholder engagement. What’s on offer to you? 💰 £85-95k, bonus of up to 10%, flexible working…