in Computer Science, Mathematics, Statistics or a similar engineering discipline. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop, Splunk. Data manipulation using tools like MapReduce or SQL. Experience with network security products and solutions. HTML, CSS and JavaScript experience. Python. If you are more »
best-of-breed Java toolsets - focused on microservices architectures, powerful front- and back-end frameworks, RESTful services, and everything from NoSQL databases like MongoDB and Hadoop, high-performance data grids like Hazelcast to multi-node relational systems. You will be working in a Scrum team with cross-functional skills in more »
programming language, ideally Python but can also be Java or C/C++. SQL experience. Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau). Get in touch with Ella Alcott - Ella@engagewithus.com more »
Bedford, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
PyTorch etc.) MLOps experience Nice to have: Familiarity with Git or other version control systems Computer Vision library exposure Understanding of Big Data technologies (Hadoop, Spark, etc.) Experience with Cloud platforms (AWS, GCP or Azure) This is a fully remote role, but may require very occasional travel (once a more »
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a more »
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a more »
Job Description Software Engineering Lead, Data - SQL Server, Oracle, Python, Hadoop, ETL Do you enjoy leading a team that develops high-quality code? Are you a highly visible champion with a ‘can do’ attitude and enthusiasm that inspires others? About the Business: LexisNexis Risk Solutions is the essential partner … point of escalation for software development issues within a specific area of responsibility Requirements Have experience of working with databases, e.g. SQL Server, MySQL, Hadoop, Oracle or any other RDBMS. Be able to partner with and lead internal and external technology resources in solving complex business needs Be able to more »
Cloud ML Engine, Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Familiarity with big data technologies and distributed computing frameworks, such as Hadoop, Spark, or Apache Flink. Experience scaling an “API Ecosystem”, designing and implementing “API-First” integration patterns. Experience working with authentication and authorisation protocols/ more »
the following platforms: MySQL or Cassandra. Experience of developing and deploying applications into AWS or a private cloud. Exposure to any of the following: Hadoop, JMS, Zookeeper, Spring, JavaScript, UI Development. Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our more »
Technical Discipline. Technical Expertise: Proficiency in SQL and experience with cloud-based data pipelines (Azure, AWS, GCP). Familiarity with big data tools like Hadoop and Spark. Data Management Skills: Hands-on experience working with large data sets, data pipelines, workflow management tools, and Azure cloud services. Exposure to more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Monitoring utilising products such as: Prometheus, Grafana, ELK, Filebeat, etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop ecosystem Edge technologies, e.g. NGINX, HAProxy, etc. Excellent knowledge of YAML or similar languages Desirable Requirements: JupyterHub awareness MinIO or similar S3 storage more »
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Workday
algorithms and data structures A proactive mindset with excellent problem-solving and communication skills Experience with big data technologies such as Apache Kafka, Spark, Hadoop, or similar systems. Preferred Skills: Demonstrated experience with scripting languages like Python, Bash, etc. Testing and troubleshooting skills with the ability to walk from more »
limited to: Backend technology, Python. Databases like MSSQL. Front-end technology, Java. Cloud platform, AWS. Programming language, JavaScript (React.js) Big data technologies such as Hadoop, Spark, or Kafka. What We Need from You: Essential Skills: A degree in Computer Science, Engineering, or a related field, or equivalent experience. Proficiency more »
Manchester, England, United Kingdom Hybrid / WFH Options
Version 1
BASED WITH VERY OCCASIONAL TRAVEL TO CLIENT SITES AND OFFICE. Would you like the opportunity to expand your skillset across Java, Python, Spark, Hadoop, Trino & Airflow across the Banking & Financial Services industries? How about if you worked with an Innovation Partner of the Year Winner (2023 Oracle EMEA … with the latest trends and best practices, and share knowledge with the team. Qualifications You will have expertise within the following: Java, Python, Spark, Hadoop (Essential) Trino, Airflow (Desirable) Architecture and capabilities. Designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration skills. more »
Azure SQL Data Warehouse, Azure Data Lake, AWS S3, AWS RDS, AWS Lambda or similar Have experience with Open Source big data products, e.g. Hadoop, Hive, Pig, Impala or similar Have experience with Open Source non-relational or NoSQL data repositories such as: MongoDB, Cassandra, Neo4j or similar Be more »