Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience with the following software and tools: big data tools such as Hadoop, Spark, and Kafka; relational SQL and NoSQL databases, including Postgres and Cassandra; data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow; and AWS.
with cloud platforms (AWS, Azure, GCP) and deploying models. Ability to use data visualization tools like Tableau or Power BI. Nice-to-Have: Familiarity with big data tools like Hadoop, Kafka, Spark. Knowledge of data governance and validation standards in energy. Experience with distributed computing and large-scale deployment. Strong communication skills for explaining complex validation results. At GE
projects demonstrating your ability to solve problems with innovative ideas or coding challenges. Familiarity with Agile practices in a collaborative team setting. Exposure to big data tools, such as Hadoop and Spark, for managing large-scale datasets. Experience with cloud platforms like Microsoft Azure. Why Join? They have created a super collaborative culture offering a great balance of impact
curious, and driven to innovate continuously. Ability to manage multiple tasks and projects under tight deadlines. Experience with software development (e.g., Python, Java, Scala) and Big Data tools (e.g., Hadoop, Spark), ideally in a cloud-based environment (AWS, Azure, or GCP). Proficiency in SQL. Experience using SAP SuccessFactors and certification in SuccessFactors modules. An Inclusive Workplace City Football
scale data pipelines. Experience in dealing with streaming and batch compute frameworks like Spring Kafka, Kafka Streams, Flink, Spark Streaming, Spark. Experience with large-scale computing platforms such as Hadoop, Hive, Spark, and NoSQL stores. Exposure to UI development of visualisations in modern web apps (e.g. Streamlit, Retool) is nice to have. Experience with AWS, GCP. Benefits Roku is
Telford, England, United Kingdom Hybrid / WFH Options
hackajob
development environment. Experience estimating task effort and identifying dependencies. Excellent communication skills. Familiarity with Python and its numerical, data, and machine learning libraries. Favourable if you have: experience with Hadoop and Jenkins; Azure certification; AWS certification; familiarity with Java. This position is a full-time, permanent role and applicants must have (or be able to acquire) SC clearance.
automated testing and deployments using CI/CD pipelines. Continual learning through internal and external training. What you'll bring (mandatory): Proficient in at least two of the following: Hadoop, GCP, or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and Deployment strategies.
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Citi
Application Senior Support Manager - VP role at Citi, Belfast, Northern Ireland, United Kingdom (posted 3 days ago).
Terraform, Jenkins, Bamboo, Concourse, etc. Monitoring using products such as Prometheus, Grafana, ELK, and Filebeat. Observability - SRE. Big Data solutions (ecosystems) and technologies such as Apache Spark and the Hadoop ecosystem. Edge technologies, e.g. NGINX, HAProxy. Excellent knowledge of YAML or similar languages. The following technical skills and experience would be desirable for a Data DevOps Engineer: Jupyter Hub
of predictive modelling, machine learning, clustering and classification techniques. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau).
Experience with data analysis and visualization tools (e.g., Matplotlib, Seaborn, Tableau). Ability to work independently and lead projects from inception to deployment. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable. MSc or PhD in Computer Science, Data Science, Artificial Intelligence, or related field is preferred.
learn). Proficiency in working with large datasets, data wrangling, and data preprocessing. Experience in data science, statistical modelling, and data analytics techniques. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable. MSc or PhD in Computer Science, Data Science, Artificial Intelligence, or related field is preferred.
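Listings like the one above ask for hands-on data wrangling and preprocessing. Purely as an illustrative sketch (not taken from any posting; the field name and records are invented), a minimal mean-imputation and min-max scaling step in plain Python might look like:

```python
from statistics import mean

def impute_and_scale(rows, field):
    """Fill missing values with the column mean, then min-max scale to [0, 1]."""
    present = [r.get(field) for r in rows if r.get(field) is not None]
    fill = mean(present)
    values = [r.get(field) if r.get(field) is not None else fill for r in rows]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant columns
    return [(v - lo) / span for v in values]

rows = [{"age": 20}, {"age": None}, {"age": 40}]
print(impute_and_scale(rows, "age"))  # -> [0.0, 0.5, 1.0]
```

In practice pandas (`fillna`, `min`, `max`) would be used for the same steps at scale; the sketch only shows the underlying logic.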
. Strong statistical, mathematical, and data modelling skills. Experience with data visualisation tools (e.g., Power BI, Tableau, matplotlib, Plotly). Familiarity with big data tools and cloud platforms (e.g., Hadoop, Spark, Azure, AWS). Ability to work with sensitive or classified data in secure environments. Desirable: Experience in applying data science to one or more of the following: cyber
tools. Experience with SQL. Experience working directly with business stakeholders to translate between data and business needs. Experience using cloud storage and computing technologies such as AWS Redshift, S3, Hadoop, etc. Experience with data modeling, warehousing, and building ETL pipelines. PREFERRED QUALIFICATIONS: Experience managing, analyzing, and communicating results to senior leadership. Experience with statistical analytics and programming languages
Stevenage, England, United Kingdom Hybrid / WFH Options
Capgemini
Experience with data analysis and visualization tools (e.g., Matplotlib, Seaborn, Tableau). • Ability to work independently and lead projects from inception to deployment. • Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable. • MSc or PhD in Computer Science, Data Science, Artificial Intelligence, or related field is preferred.
scikit-learn, pandas, R, Weka; excellence in more than one of these is highly desirable. Good understanding of SQL and NoSQL (plus) database technologies; understanding of big data technologies: Hadoop, Spark (plus). Good knowledge of statistics: hypothesis testing, confidence intervals, and A/B tests. Ability to demonstrate a good understanding of statistical models and machine learning algorithms.
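Several listings above ask for working knowledge of hypothesis testing and A/B tests. As an illustration only (not part of any listing; the conversion counts are invented), a two-proportion z-test for an A/B experiment can be sketched with nothing but the standard library:

```python
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (expressed via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_ztest(200, 2000, 260, 2000)  # 10% vs 13% conversion
print(f"z = {z:.3f}, p = {p:.4f}")
```

In day-to-day work one would reach for scipy.stats or statsmodels instead; the point of the sketch is only the statistical reasoning behind the test.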
Manchester, England, United Kingdom Hybrid / WFH Options
Scot JCB Ltd
scikit-learn, pandas, R, Weka; excellence in more than one of these is highly desirable. Good understanding of SQL and NoSQL (plus) database technologies; understanding of big data technologies: Hadoop, Spark (plus). Good knowledge of statistics: hypothesis testing, confidence intervals, and A/B tests. Ability to demonstrate a good understanding of statistical models and machine learning algorithms.
Leeds, England, United Kingdom Hybrid / WFH Options
Scot JCB Ltd
scikit-learn, pandas, R, Weka; excellence in more than one of these is highly desirable. Good understanding of SQL and NoSQL (plus) database technologies; understanding of big data technologies: Hadoop, Spark (plus). Good knowledge of statistics: hypothesis testing, confidence intervals, and A/B tests. Ability to demonstrate a good understanding of statistical models and machine learning algorithms.
to line manage a small team of junior data engineers. Continual learning through internal and external training. What you'll bring (mandatory): Proficient in at least two of the following: Hadoop, GCP, or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and Deployment strategies.
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Citigroup Inc
Development role. Demonstrated execution capabilities. Strong analytical and quantitative skills; data-driven and results-oriented. Experience with Core Java required (Spark a plus). Experience with SQL. Experience working with Hadoop, Hive, Sqoop, and other technologies in Cloudera's CDP distribution. Understanding of version control (git). Experience working as part of an agile team. Excellent written and oral communication skills.
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Citi
Senior Applications Support Analyst - AVP. About the Team: Data Solutions Technology strives to provide measurable competitive advantage to our business by delivering high-quality, innovative, and cost-effective reference data technology and operational solutions in order to meet the needs
Leeds, England, United Kingdom Hybrid / WFH Options
Widen the Net Limited
team! You will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets. -Work across the technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing -Design and develop scalable ETL pipelines to automate data processes and optimize delivery -Implement and manage data warehousing solutions, ensuring data integrity
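Roles like this centre on ETL: extract raw data, validate and transform it, then load it into a warehouse. As a framework-free sketch only (the `sales` table, CSV fields, and sample data are invented for illustration; in such a role Apache Airflow would schedule and orchestrate steps like these), the three stages can be shown in plain Python:

```python
import csv
import io
import sqlite3

def extract(raw_csv: str):
    """Extract: parse raw CSV text into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: cast types and drop rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append((r["id"], float(r["amount"])))
        except (KeyError, ValueError):
            continue  # skip malformed rows to protect data integrity
    return out

def load(records, conn):
    """Load: write validated records into a warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", records)
    conn.commit()

raw = "id,amount\na,10.5\nb,not-a-number\nc,2.0\n"
conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())
# -> (2, 12.5): the malformed row was rejected during transform
```

The same extract/transform/load split maps directly onto Airflow tasks, with the scheduler handling retries, dependencies, and delivery optimisation.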
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Widen the Net Limited
team! You will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets. -Work across the technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing -Design and develop scalable ETL pipelines to automate data processes and optimize delivery -Implement and manage data warehousing solutions, ensuring data integrity
Manchester, England, United Kingdom Hybrid / WFH Options
Ripjar
in production and have a curiosity and interest in learning more. In this role, you will be using Python (specifically PySpark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & Nifi are used to co-ordinate the processing of data, while Jenkins …