Maidenhead, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
experience designing or delivering large-scale pricing, AI, or recommendation systems. Deep technical knowledge of ML frameworks (TensorFlow, PyTorch), cloud platforms (AWS, GCP, Azure), and big data tools (Spark, Hadoop). Demonstrated success in building business-critical, real-time algorithmic solutions. Strong communication and stakeholder engagement skills, translating complexity into business value. A commercial, data-led mindset and a …
London, England, United Kingdom Hybrid / WFH Options
Deutsche Bank
PL/SQL, Linux, Shell Script. UI development experience using modern JavaScript frameworks (ReactJS strongly preferred). Experience with relational databases (Oracle and PostgreSQL preferred) and/or Big Data technologies (Hadoop and Spark preferred). Experience with CI/CD pipelines and cloud-relevant technologies such as Kubernetes, Helm Charts, Docker, Terraform, and at least one major cloud provider (GCP preferred). Bachelor …
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies
developing and deploying web services working with open-source resources in a government computing environment. Maintaining backend GIS technologies. ICD 503. Big data technologies such as Accumulo, Spark, Hive, Hadoop, or Elasticsearch. Familiarity with: hybrid cloud/on-prem architecture, AWS, C2S, and OpenStack; concepts such as data visualization, data management, data integration, user interfaces, databases. CompTIA …
to implement them through libraries. Experience with programming, ideally Python, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. Hadoop/Spark/SQL. Experience with, or the ability to quickly learn, open-source software including machine learning packages such as Pandas and scikit-learn, along with data visualisation technologies. Experience …
London, England, United Kingdom Hybrid / WFH Options
S.i. Systems
ECS). Knowledge or experience with FAISS, vector search, or NLP. Knowledge of scripting. Experience with CI/CD and GitLab. Experience working with Big Data tools such as PySpark, Hadoop, Databricks, MongoDB, Apache Spark, Apache Kafka. Experience with monitoring tools such as Splunk, Grafana, and Prometheus. …
your workflow supported by Jira. Qualifications Skill requirements: • At least 3 years of programming with Python, including extensive experience with Pandas for data manipulation and analysis • Solid understanding of the Hadoop ecosystem • Ability to analyze complex datasets and extract actionable insights • ETL pipeline development • Familiarity with machine learning workflows Optional skills: • Hands-on experience with Apache Spark (preferably PySpark) and …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
and CI/CD paradigms and systems such as Ansible, Terraform, Jenkins, Bamboo, Concourse, etc. Observability/SRE. Big Data solutions (ecosystems) and technologies such as Apache Spark and the Hadoop ecosystem. Excellent knowledge of YAML or similar languages. The following technical skills and experience would be desirable: JupyterHub awareness; RabbitMQ or other common queue technology, e.g. ActiveMQ; NiFi; Rego …
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
Experience: 5+ years of experience with Microsoft data technologies. Strong expertise in SQL Server Integration Services, SQL, data warehousing, and ETL/ELT processes. Proficiency in Python, Power BI, Hadoop, and Spark for data processing and analytics. Strong stakeholder management and communication skills. Why join? Hybrid working, with a requirement to attend our office at least once a week. …
Platform (GCP), including network architectures (Shared VPC, hub-and-spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and load balancing. Data Processing & Transformation: Utilize a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow. Core GCP Services Management: Work extensively with services like Google Kubernetes Engine (GKE …
Mountain View, California, United States Hybrid / WFH Options
LinkedIn
Scala or other relevant coding languages. -Experience in architecting, building, and running large-scale distributed systems -Experience with industry, open-source, and/or academic research in technologies such as Hadoop, Spark, Kubernetes, Feathr, GraphQL, gRPC, Apache Kafka, Pinot, Samza, or Venice -Experience with open-source project management and governance. Suggested Skills: Distributed systems; Backend systems; Infrastructure; Java/Golang …
Mountain View, California, United States Hybrid / WFH Options
LinkedIn
Kubernetes (or similar) ecosystem - Experience with Kubernetes controller development, automating cluster management - Golang coding experience - Experience with industry, open-source, and/or academic research in technologies such as Hadoop, Spark, Kubernetes, Feathr, GraphQL, gRPC, Apache Kafka, Pinot, Samza, or Venice - Experience with open-source project management and governance. Suggested Skills: -Distributed Systems -Golang -Technical Leadership. You will benefit …
London, England, United Kingdom Hybrid / WFH Options
Bison Global Technology Search
excellent project management skills and technical experience to ensure successful delivery of complex data projects on time and within budget. Responsibilities: Lead and manage legacy data platform migrations (Teradata, Hadoop), data lake builds, and data analytics projects from initiation to completion. Develop comprehensive project plans, including scope, timelines, resource allocation, and budgets. Monitor project progress, identify risks, and implement …
Greater London, England, United Kingdom Hybrid / WFH Options
Quant Capital
multi-threaded Java – if you haven’t been siloed in a big firm then don’t worry. Additional exposure to the following is desired. Tech stack you will learn: Hadoop and Flink; Rust, JavaScript, React, Redux, Flow; Linux, Jenkins; Kafka, Avro, Kubernetes, Puppet. Involvement in the Java community. My client is based in London. Home working is encouraged, but you …
London, England, United Kingdom Hybrid / WFH Options
BrightBox Group Ltd
sensitive nature of the data we handle. Required: - Proficiency in AWS services and tools related to data storage, processing, and analytics. - Strong experience with big data technologies such as Hadoop, Spark, or similar frameworks. - Active SC clearance is essential for this role. Seniority level: Mid-Senior level. Employment type: Contract. Job function: Information …
be able to demonstrate how you have: Worked hands-on with data engineering tools and technologies (databases, data warehouses, and/or data integration solutions, e.g. SQL, Python, Spark, Hadoop, etc.). Knowledge of data architecture and best-practice data integration approaches. Experience working with Microsoft data solutions such as Fabric, and/or solutions such as Snowflake or Redshift. Engaged …
London, England, United Kingdom Hybrid / WFH Options
Global Relay
an asset: Testing performance with JMeter or similar tools. Web services technology such as REST, JSON, or Thrift. Testing web applications with Selenium WebDriver. Big data technology such as Hadoop, MongoDB, Kafka, or SQL. Network principles and protocols such as HTTP, TLS, and TCP. Continuous integration systems such as Jenkins or Bamboo. Continuous delivery concepts. What you can expect …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Santander USA
our community. We are the Insurance and Asset Management product IT team, delivering and maintaining applications across various technologies, including online applications (APIs, microservices), batch processes (Spark/Scala, Hadoop/Big Data ecosystem), and legacy components. Our main objective is to deliver innovative solutions that meet business requirements, enabling us to sell Insurance and Wealth products through all …
Sunnyvale, California, United States Hybrid / WFH Options
LinkedIn
natural language processing. Experience with iterative, test-driven development. Experience with configuration management (SVN, Git, Ant, Maven, etc.). Experience with developing and designing consumer-facing products. Experience with Hadoop, Pig, or other MapReduce paradigms. Knowledge of the internals of Lucene/Solr or other information retrieval systems. Published work in academic conferences or industry circles. Suggested Skills: Experience in Machine …