London, South East England, United Kingdom Hybrid / WFH Options
Solirius Consulting
or Django, Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets, and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on coding experience more »
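The ETL orchestration tools named above (Luigi, Airflow, Argo) all schedule variations of the same extract-transform-load pattern. As a rough, framework-free illustration of that pattern (all function and column names here are invented for the sketch, not taken from any of these tools):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop records with missing ids, normalise names."""
    return [
        {"id": int(r["id"]), "name": r["name"].strip().title()}
        for r in rows
        if r.get("id")  # data-quality rule: skip rows without an id
    ]

def load(rows: list[dict], store: dict) -> None:
    """Load: upsert rows into a keyed store (stand-in for a warehouse table)."""
    for r in rows:
        store[r["id"]] = r

# A tiny end-to-end run with deliberately messy input.
raw = "id,name\n1, alice \n,missing\n2,BOB\n"
warehouse: dict[int, dict] = {}
load(transform(extract(raw)), warehouse)
```

In Airflow or Luigi, each of the three steps would typically become its own task/operator with the scheduler handling ordering and retries; the data-quality filter in `transform` is the kind of cleaning step several of these listings ask for.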
ETL processes, and data warehousing. - Significant exposure and hands-on experience with at least 2 of the programming languages - Python, Java, Scala, GoLang. - Significant experience with Hadoop, Spark and other distributed processing platforms and frameworks. - Experience working with open table/storage formats like Delta Lake, Apache Iceberg or Apache Hudi. more »
London, South East England, United Kingdom Hybrid / WFH Options
McGregor Boyall
models, ETL processes, and data warehousing solutions. Programming: Utilize Python, Java, Scala, or GoLang to build and optimize data pipelines. Distributed Processing: Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database more »
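The CDC (change-data-capture) pipelines mentioned above boil down to replaying a stream of insert/update/delete events against a replica of the source table; Kafka is typically the transport. A minimal sketch of the replay logic, with an invented event shape standing in for what a real connector would publish:

```python
# Apply a stream of CDC events to a local replica.
# The event format is hypothetical; in a real pipeline these records
# would be consumed from a Kafka topic rather than a Python list.
events = [
    {"op": "insert", "key": 1, "value": {"name": "alice"}},
    {"op": "insert", "key": 2, "value": {"name": "bob"}},
    {"op": "update", "key": 1, "value": {"name": "alice2"}},
    {"op": "delete", "key": 2, "value": None},
]

def apply_event(replica: dict, event: dict) -> None:
    """Mirror one source-table change onto the replica."""
    if event["op"] in ("insert", "update"):
        replica[event["key"]] = event["value"]
    elif event["op"] == "delete":
        replica.pop(event["key"], None)

replica: dict = {}
for e in events:
    apply_event(replica, e)
```

Because each event is keyed and the apply step is idempotent for upserts, the same loop works whether events arrive once or are re-delivered after a consumer restart, which is why CDC pairs naturally with Kafka's at-least-once delivery.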
Docker, Kubernetes). CI/CD pipelines and tools (e.g. DBT, Jenkins, GitLab CI) Desirable: Experience with analytics tools and frameworks (e.g., Apache Spark, Hadoop), SQL, SageMaker, DataRobot, Google Cloud and Azure Data platform; metadata-driven frameworks to ingest, transform and manage data more »
one leading business intelligence platform (e.g. Microsoft, Crystal, Qlik, SAP, Tableau). Good understanding of open source, big data, and cloud data platforms (e.g. Hadoop, Spark, Hive, Pentaho, AWS, Azure); given a business problem, you can analyse and evaluate options and recommend solutions. Proven experience in designing, building and more »
Must have 8+ years' experience with relational databases like Oracle, NoSQL databases and/or Big Data technologies (e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other open source). Must have experience in Data Security Solutions (Identity and Access Management and Data Security Access Management). Must have 3+ years more »
and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau more »
Analytics. Strong SQL and Python skills. Experience with data modeling, ETL processes, and data warehousing. Knowledge of big data technologies such as Spark and Hadoop is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Experience in the healthcare sector is a plus more »
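Several of these roles pair "strong SQL and Python skills" for warehousing work; the two commonly meet through Python's standard-library `sqlite3` module (used here only as a stand-in for a real warehouse, with illustrative table and column names):

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# Load step: parameterised bulk insert.
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("north", 100.0), ("south", 50.0), ("north", 25.0)],
)

# A typical reporting aggregation of the kind ETL jobs materialise.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY REGION ORDER BY region"
).fetchall()
```

The same insert/aggregate pattern carries over to Postgres, Snowflake or Spark SQL with a different connection object, which is why these listings treat SQL plus a scripting language as one combined skill.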
industry experience Experience in distributed system design Experience with Pure/Alloy Working knowledge of open-source tools such as AWS Lambda, Prometheus; Spark, Hadoop or Snowflake knowledge would be a plus. Additional Information Location: This role can be delivered in a hybrid nature from one of these offices Dublin more »
within a typical retail trading environment is key. Experience required: A background in leveraging hands-on skills using tools such as Python, R, Spark, Hadoop, SQL and cloud-based platforms such as GCP, Azure and AWS to manipulate and analyse various data sets in large volumes Background in data more »
London, South East England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases Programming languages such as Spark or Python Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits: Base Salary: £45,000 - £75,000 (DoE) Discretionary Bonus DV Bonus Flex Fund: £5000 Health: Private Medical Insurance Annual Leave: 25 Days plus more »
on experience with analytic tools like R and Python, and visualization tools like Tableau and Power BI Exposure to cloud platforms and big data systems such as Hadoop HDFS and Hive is a plus Ability to work with IT and Data Engineering teams to help embed analytic outputs in business processes Graduate more »
experience Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift) Experience of analytics technologies (Spark, Hadoop, Kafka) Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts more »
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bright Purple
have: A passion for manipulation and visualization of data. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop/Splunk. Experience with network security products and solutions. Experience working with Python, HTML, CSS and JavaScript. If you are a driven and more »
share knowledge with the team. Qualifications You will have expertise within the following: Java and Python development knowledge (Essential) Previous experience with Spark or Hadoop (Essential) Trino or Airflow (Desirable) Architecture and capabilities. Designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration skills. more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Founding Data Engineer Role update - this role is now contract only. And really a mix of data science and engineering... Central London 3 days a week, hybrid NB – we are looking for someone with specific experience around fintech, finance, banking more »
Experience with relational databases such as MySQL, PostgreSQL or others, and experience with NoSQL databases such as Redis, MongoDB or others; * Experience with Big Data technologies like the Hadoop ecosystem is a plus. * Excellent writing, proof-reading and editing skills. Able to create documentation that can express cloud architectures using text and more »
Version 1 has celebrated over 26 years in the Technology industry and continues to be trusted by global brands to deliver IT solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Implement security best practices and ensure compliance with cybersecurity standards. Collaborate with development teams to integrate security practices … Familiarity with SQL and database management systems, including relational and NoSQL databases. Knowledge of data streaming technologies (e.g., Kafka) and big data platforms (e.g., Hadoop). Understanding of cybersecurity principles and best practices. Familiarity with architectural styles and experience in implementing DevSecOps practices. Excellent problem-solving skills and attention more »