Technical Discipline. Technical Expertise: Proficiency in SQL and experience with cloud-based data pipelines (Azure, AWS, GCP). Familiarity with big data tools such as Hadoop and Spark. Data Management Skills: Hands-on experience working with large data sets, data pipelines, workflow management tools, and Azure cloud services. Exposure to …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Workday
algorithms and data structures. A proactive mindset with excellent problem-solving and communication skills. Experience with big data technologies such as Apache Kafka, Spark, Hadoop, or similar systems. Preferred Skills: Demonstrated experience with scripting languages like Python, Bash, etc. Testing and troubleshooting skills with the ability to walk from …
Maidstone, Kent, United Kingdom Hybrid / WFH Options
Worley
of technology to automate data pipelines and build analytical warehouses · Deep understanding of cloud-based data platforms (Azure SQL DB, Azure Synapse, ADLS, AWS, Hadoop, Spark, Snowflake, NoSQL, etc.) · Proficient scripting in programming languages such as Java, Python, Scala · Expert in SQL · Machine Learning · Good basic understanding of …
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Deloitte
governance (Unity Catalog), security and access design on Cloud technologies (AWS, GCP, Azure). Experience with design and implementation in big data technologies such as Hadoop, NoSQL, MPP, OLTP, and OLAP, or full-lifecycle data science solutions. Experience with Data Acquisition, Integration & Transformation solutions leveraging Batch, Micro-batch, CDC and Event …
automation & configuration management; Ansible (plus Puppet, SaltStack), Terraform, CloudFormation; NodeJS, React/Material UI (plus Angular), Python, JavaScript; Big data processing and analysis, e.g. Apache Hadoop (CDH), Apache Spark; RedHat Enterprise Linux, CentOS, Debian or Ubuntu. Java 8, Spring framework (preferably Spring Boot), AMQP RabbitMQ, open-source technologies; Experience of …
the following platforms: MySQL or Cassandra. Experience of developing and deploying applications into AWS or a private cloud. Exposure to any of the following: Hadoop, JMS, Zookeeper, Spring, JavaScript, UI Development. Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our …
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks, or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a …
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks, or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a …
limited to: Backend technology, Python. Databases like MSSQL. Front-end technology, JavaScript (React.js). Cloud platform, AWS. Big data technologies such as Hadoop, Spark, or Kafka. What We Need from You: Essential Skills: A degree in Computer Science, Engineering, or a related field, or equivalent experience. Proficiency …
capabilities such as GitHub, Jenkins and UrbanCode methodologies, demonstrating engineering excellence and a passion for automation Additionally, data domain technology experience including: Teradata, Oracle, Hadoop, DB2 and familiarity with NoSQL databases such as Cassandra and HBase Expertise in Cloud Native technologies including networking & security is a plus Understanding how …
with a statistical programming language (Python or R) and experience with libraries specifically for Machine Learning or Data Analytics; knowledge of Hadoop/MapReduce is a plus. Hands-on experience building and delivering large-scale enterprise systems/products. Experience attracting, hiring and retaining top engineering talent. Experience …
Cloud ML Engine, Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Familiarity with big data technologies and distributed computing frameworks, such as Hadoop, Spark, or Apache Flink. Experience scaling an “API Ecosystem”, designing, and implementing “API-First” integration patterns. Experience working with authentication and authorisation protocols/…
Bedford, Bedfordshire, United Kingdom Hybrid / WFH Options
Understanding Recruitment
ML frameworks (TensorFlow, PyTorch, etc.). MLOps experience. Nice to have: Familiarity with Git or other version control systems. Computer vision library exposure. Understanding of big data technologies (Hadoop, Spark, etc.). Experience with cloud platforms (AWS, GCP or Azure). This is a fully remote role, but may require very occasional travel (once a …
Bedford, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
PyTorch, etc.) MLOps experience Nice to have: Familiarity with Git or other version control systems Computer vision library exposure Understanding of big data technologies (Hadoop, Spark, etc.) Experience with cloud platforms (AWS, GCP or Azure) This is a fully remote role, but may require very occasional travel (once a …
Northampton, Northamptonshire, East Midlands, United Kingdom Hybrid / WFH Options
Dupen Ltd
Linux, APIs, infrastructure design – load balancing, VMs, PostgreSQL, vector dbs. Senior ML Learning Engineer desirable skills: Version control (Git), computer vision libraries, Big Data (Hadoop, Spark), Cloud – AWS, Google Cloud, Azure, and a knowledge of secure coding techniques – PCI-DSS, PA-DSS, ISO27001. This is a fantastic opportunity to …
environments: Sybase ASE/IQ, Oracle or DB2 It would be great if you have: Experience in cluster computing and big data solutions: Spark, Hadoop, HDFS, XRS using public cloud Our Commitment to Diversity & Inclusion: Did you know that Apexon has been Certified™ by Great Place To Work®, the …
Manchester, North West, United Kingdom Hybrid / WFH Options
Dupen Ltd
Systems: Linux, APIs, infrastructure design – load balancing, VMs, PostgreSQL, vector dbs. ML Learning Engineer – desirable skills: Version control (Git), computer vision libraries, Big Data (Hadoop, Spark), Cloud – AWS, Google Cloud, Azure, and a knowledge of secure coding techniques – PCI-DSS, PA-DSS, ISO27001. Note: as there are actually two …
Employment Type: Permanent
Salary: £50000 - £60000/annum To £60,000 + range of benefits
Milton Keynes, Buckinghamshire, South East, United Kingdom Hybrid / WFH Options
Dupen Ltd
Systems: Linux, APIs, infrastructure design – load balancing, VMs, PostgreSQL, vector dbs. ML Learning Engineer – desirable skills: Version control (Git), computer vision libraries, Big Data (Hadoop, Spark), Cloud – AWS, Google Cloud, Azure, and a knowledge of secure coding techniques – PCI-DSS, PA-DSS, ISO27001. Note: as there are actually two …
Milton Keynes, Bedfordshire, South East, Woolstone, Buckinghamshire, United Kingdom Hybrid / WFH Options
Dupen Ltd
Systems: Linux, APIs, infrastructure design – load balancing, VMs, PostgreSQL, vector dbs. ML Learning Engineer – desirable skills: Version control (Git), computer vision libraries, Big Data (Hadoop, Spark), Cloud – AWS, Google Cloud, Azure, and a knowledge of secure coding techniques – PCI-DSS, PA-DSS, ISO27001. Note: as there are actually two …
Employment Type: Permanent
Salary: £50000 - £60000/annum To £60,000 + range of benefits
Experience with modeling tools such as PyTorch, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. Experience in building speech recognition, machine translation and natural language processing systems (e.g., commercial speech products or government speech projects). Amazon is …
best-of-breed Java toolsets - focused on microservices architectures, powerful front- and backend frameworks, RESTful services, and everything from NoSQL databases like MongoDB and Hadoop, high-performance data grids like Hazelcast, to multi-node relational systems. You will be working in a Scrum team of cross-functional skills in …
Jenkins and UrbanCode methodologies, demonstrating engineering excellence and a passion for automation Additionally, data domain technology experience including: Database technology such as Teradata, Oracle, Hadoop, DB2 and familiarity with NoSQL databases such as Cassandra and HBase Expertise in container & virtualization/hypervisor technologies such as K8s, Firecracker/gVisor …