/CD. Strong design and coding skills (e.g. Python, Scala, JavaScript). Experience with Microsoft or AWS data stack e.g. Microsoft Azure Data Lake, Hadoop (preferably with Spark), Cosmos DB, HDInsight/HBase, MongoDB, Redis, Azure Table/Blob stores etc. Exposure to tools like SAP technologies and Alteryx more »
Surrey, England, United Kingdom Hybrid / WFH Options
The JM Longbridge Group
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications : Bachelor’s degree in computer science, Engineering, or related field (or equivalent experience). Experience with cloud more »
PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. - Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications: - Bachelor's Degree in Computer Science or Engineering. - Experience with cloud technologies, particularly Azure and AWS. - Proficiency more »
Employment Type: Permanent
Salary: £52000 - £62000/annum Bonus + Full Benefits
at scale. What we expect from you Strong experience building python packages, installable with pip/conda Experience processing big data, ideally in a Hadoop/Spark environment Experience working with relational databases, and SQL-like operations Experience with Airflow/orchestration tooling is beneficial Understanding of Continuous Integration more »
Warehousing: Familiarity with data warehousing concepts and technologies, including star and snowflake schema designs. Big Data Technologies: Understanding of big data platforms such as Hadoop, Spark, and tools like Hive, Pig, and HBase. Data Integration: Ability to integrate data from disparate sources using middleware or integration tools as well more »
and NoSQL databases Programming languages such as Spark or Python Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits : Base Salary: £45,000 - £75,000 (DoE) Discretionary Bonus: Circa 10% per annum DV Bonus: Circa £5,000 Flex Fund: £5000 Health: Private more »
partners Preferred Requirements Experience or strong interest in blockchain and other Web 3.0 technologies Experience with OLAP technologies, such as Presto/Trino, Spark, Hadoop, Athena, or BigQuery is a plus Experience in Golang or any other strongly-typed programming language Experience mentoring and supporting fellow engineers Our Selection more »
schema design and with both OLTP and OLAP systems. You'll also have the chance to work with Cassandra (NoSQL) and highly scalable, distributed Hadoop clusters and ensure performance is as high as possible with 24/7 availability for high volumes. You’ll have the opportunity to work more »
have: A passion for manipulation and visualization of data. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop/Splunk. Experience with network security products and solutions. Experience working with Python, HTML, CSS and JavaScript. If you are a driven and more »
such as Python, C#, .Net and/or JavaScript is highly desirable. Experience with cloud platforms (e.g., Azure) and data technologies (e.g., SQL, NoSQL, Hadoop, Spark). PLEASE NOTE: You must have either UK citizenship or permanent leave to remain in the UK. Due to the high volume of more »
following: .NET (VB, C#, ASP.NET, .NET CORE) MVC Framework Python JavaScript (REACT, Bootstrap Frameworks) Database design SQL/SQL Server NoSQL technologies e.g., MongoDB, Hadoop, etc. Your Experience If you’re the right person for the role, you’ll bring experience of working on a range of applications across more »
following: .NET (VB, C#, ASP.NET, .NET CORE) MVC Framework Python JavaScript (REACT, Bootstrap Frameworks) Database design SQL/SQL Server NoSQL technologies e.g., MongoDB, Hadoop, etc. If you’re the right person for the role, you’ll bring experience of working on a range of applications across the development more »
or managing direct report(s) Travel 5%-10% Ability to program in newer and emerging languages such as R and Python; working knowledge of Hadoop and other big data technologies Additional Information About Epsilon Epsilon is a global advertising and marketing technology company positioned at the center of Publicis more »
more) Experience in Data mining, Data warehousing, ETL Experience in handling large volumes of data on SQL, NoSQL and Big Data databases Experience in Hadoop ecosystem: Hadoop, Spark, Hive, and/or Scala Experience in programming languages: PHP, Python, C++/Java Experience in Web development in Laravel more »
the Backstage framework for developer portals. - React/TypeScript full stack development - Data Science technologies including knowledge of Spark, Dask, Parquet, Iceberg formats, Apache Hadoop, Hive, Presto, SQL, Postgres, Immuta Roles & Responsibilities Implement build and deployment pipelines for applications in GitLab - Convert existing applications to more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies e.g. AWS or Azure Programming language experience e.g. Java, Python, Node.js or SQL Data technologies experience e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies e.g. AWS or Azure Programming language experience e.g. Java, Python, Node.js or SQL Data technologies experience e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
warehousing. Min 7 yrs with Python Big Data & Data lake solutions; PostgreSQL, ClickHouse or Snowflake etc Cloud infrastructure (AWS services) Data processing pipelines using Kafka, Hadoop, Hive, Storm, or Zookeeper Hands-on team leadership The Reward Joining a fast-growth, successful blockchain business. The role offers fully remote work, a more »
manage multiple tasks and projects simultaneously. Preferred Qualifications AWS Certified Solutions Architect or other relevant AWS certifications. Experience with big data technologies such as Hadoop, Spark, or similar. Knowledge of data governance and data quality best practices. Familiarity with machine learning and AI concepts and tools. more »
best-of-breed Java toolsets - focused on Microservices Architectures, powerful front- and backend frameworks, RESTful services, and everything from NoSQL databases like MongoDB and Hadoop, high-performance data grids like Hazelcast to multi-node relational systems. You will be working in a Scrum Team of cross-functional skills in more »
and performance of technical proofs of concept. · Experience working in cloud environments with AWS, Azure, or GCP. · Platform implementation experience with technologies like Apache Hadoop, Kafka, Storm, Spark, Cassandra, Elasticsearch, and others. · Ability to assess IT more »
knowledge of relevant NHS information processes (e.g. Commissioning & Payment by Result mechanisms, Secondary Uses Service) Experience of working with 'big data' technologies, such as Hadoop, NoSQL, Orange or Python Disclosure and Barring Service Check This post is subject to the Rehabilitation of Offenders Act (Exceptions Order) 1975 and as more »
Data Engineer £40,000-£45,000 Stoke-On-Trent, Hybrid Permanent role Responsibilities: Design and implement data pipelines to collect, process, and analyse aviation-related data from various sources. Develop and maintain data models, schemas, and databases optimised for performance more »
Experience of Data Lake/Hadoop platform implementation Hands-on experience in implementation and performance tuning of Hadoop/Spark implementations Experience with Apache Hadoop and the Hadoop ecosystem Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro) Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto) Experience developing software code in one or more programming languages (Java, Python, etc.) Preferred Qualifications Masters or PhD in Computer Science, Physics, Engineering or Math Hands-on experience leading large-scale global data warehousing and analytics projects Ability more »