cloud platforms (e.g., AWS, Azure, GCP). Experience deploying ML models or managing AI/ML workflows in production. Working knowledge of big data technologies like Spark, Hive, or Hadoop. Familiarity with MLOps tools (e.g., MLflow, Kubeflow, DataRobot). Education: Bachelor’s degree in Computer Science, Software Engineering, or a related technical field, or equivalent practical experience. Why …
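As a rough illustration of the MLOps tooling this role lists, here is a minimal MLflow experiment-tracking sketch; the experiment name and logged values are hypothetical placeholders, not taken from the posting.

```python
import mlflow

# Minimal MLflow tracking sketch: record one run's parameter and metric.
# Experiment name and values are illustrative placeholders.
mlflow.set_experiment("demo-experiment")

with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_metric("accuracy", 0.93)
```

By default runs are written to a local ./mlruns directory; pointing MLFLOW_TRACKING_URI at a tracking server enables the shared production workflows roles like this describe.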
S3, Lambda, BigQuery, or Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such as Spark, Hadoop, or Kafka, is a plus. Strong problem-solving skills and the ability to work in a collaborative team environment. Excellent verbal and written communication skills. Bachelor's degree in …
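To make the "SQL and relational databases" requirement concrete, a self-contained sketch using Python's built-in sqlite3 module; the table, columns, and data are invented for illustration.

```python
import sqlite3

# Create an in-memory table, load a few rows, and run an aggregate query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 200.0)],
)
for region, total in conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(region, total)
conn.close()
```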
engineering, architecture, or platform management roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design and implementation. Hands-on knowledge …
functional teams. Preferred Skills: High-Performance Computing (HPC) and AI workloads for large-scale enterprise solutions. NVIDIA CUDA, cuDNN, TensorRT experience for deep learning acceleration. Big Data platforms (Hadoop, Spark) for AI-driven analytics in professional services. Please share your CV at payal.c@hcltech.com
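One common way to exercise NVIDIA CUDA from Python is through PyTorch's CUDA backend; the sketch below is a generic GPU-compute illustration (sizes arbitrary), not a cuDNN or TensorRT example.

```python
import torch

# Run a matrix multiply on a CUDA GPU when one is visible, else on CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
a = torch.randn(2048, 2048, device=device)
b = torch.randn(2048, 2048, device=device)
c = a @ b
print(c.device, c.shape)
```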
English is required. Preferred Skills: Experience in commodities markets or broader financial markets. Knowledge of quantitative modeling, risk management, or algorithmic trading. Familiarity with big data technologies like Kafka, Hadoop, Spark, or similar. Why Work With Us? Impactful Work: Directly influence the profitability of the business by building technology that drives trading decisions. Innovative Culture: Be part of a …
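As a minimal sketch of the Kafka familiarity mentioned here, a producer using the kafka-python package; the broker address, topic, and message are placeholders.

```python
import json

from kafka import KafkaProducer

# Publish one JSON-encoded message to a topic; broker and topic are placeholders.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("trades", {"symbol": "XAU", "price": 2315.4})
producer.flush()
```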
orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some …
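For the Dataflow (Apache Beam) item, a minimal word-count pipeline sketch; it runs locally on the DirectRunner, and the same code targets Dataflow by switching runner options. The bucket paths are placeholders.

```python
import apache_beam as beam

# Classic word count: read text, split into words, count per word, write out.
with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input.txt")
        | "Words" >> beam.FlatMap(lambda line: line.split())
        | "Pair" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Write" >> beam.io.WriteToText("gs://my-bucket/output")
    )
```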
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms e.g. relevant AWS and Azure platform services; Data tools: hands-on experience with Palantir (ESSENTIAL); Data science approaches and tooling e.g. Hadoop, Spark; Data engineering approaches; Database management, e.g. MySQL, PostgreSQL; Software development methods and techniques e.g. Agile methods such as SCRUM; Software change management, notably familiarity with git; Public sector …
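To ground the Spark tooling this role references, a minimal PySpark sketch; the file path and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Load a CSV and compute a per-category row count.
spark = SparkSession.builder.appName("demo").getOrCreate()
df = spark.read.csv("data/events.csv", header=True, inferSchema=True)
df.groupBy("category").agg(F.count("*").alias("n")).show()
spark.stop()
```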
Engineering, Mathematics, Finance, etc. Proficiency in Python, SQL, and one or more of: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi). Bonus: experience with BI tools, API integrations, and graph databases. Why Join Us? Work with large-scale …
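For the workflow tools named here, a minimal Apache Airflow DAG sketch (Airflow 2.4+); the dag_id, schedule, and task body are illustrative.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extracting...")

# One daily task; the `schedule` argument requires Airflow 2.4 or later.
with DAG(
    dag_id="demo_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract)
```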
of predictive modelling, machine learning, clustering and classification techniques. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau).
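As a small, self-contained example of the classification techniques this posting cites, a scikit-learn sketch on a built-in toy dataset:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Train and evaluate a random-forest classifier on the iris dataset.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```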
.NET code. Experience working on distributed systems. Strong knowledge of Kubernetes and Kafka. Experience with Git and deployment pipelines. Having worked with at least one of the following stacks: Hadoop, Apache Spark, Presto; AWS Redshift, Azure Synapse, or Google BigQuery. Experience profiling performance issues in database systems. Ability to learn and/or adapt quickly to complex issues. Happy …
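One concrete form the "profiling performance issues" skill takes in Spark-style engines is inspecting the optimizer's physical plan before running a query; a minimal PySpark sketch on synthetic data:

```python
from pyspark.sql import SparkSession

# Build a self-join on synthetic data and print its physical plan
# (scans, shuffles/exchanges, join strategy) without executing it.
spark = SparkSession.builder.appName("profiling-demo").getOrCreate()
df = spark.range(1_000_000).withColumnRenamed("id", "user_id")
joined = df.join(df.sample(fraction=0.1), on="user_id")
joined.explain()
spark.stop()
```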
South East London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solution Architect Associate, Data Analytics Speciality, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits …
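For the "deploying data via APIs and microservices" bullet, a minimal FastAPI sketch; the route and payload are invented for illustration.

```python
from fastapi import FastAPI

app = FastAPI()

# A single read-only endpoint exposing dataset metadata.
@app.get("/datasets/{name}")
def get_dataset(name: str) -> dict:
    return {"name": name, "status": "available"}
```

Assuming the file is named main.py, it would be served with, e.g., `uvicorn main:app`.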
Azure Functions. Strong knowledge of scripting languages (e.g., Python, Bash, PowerShell) for automation and data transformation. Proficient in working with databases, data warehouses, and data lakes (e.g., SQL, NoSQL, Hadoop, Redshift). Familiarity with APIs and web services for integrating external systems and applications into orchestration workflows. Hands-on experience with data transformation and ETL (Extract, Transform, Load) processes.
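To illustrate the ETL scripting these requirements describe, a minimal extract-transform-load sketch in plain Python; the file names, column, and transformation rule are placeholders.

```python
import csv
import json

def etl(src: str, dest: str) -> None:
    with open(src, newline="") as f:      # extract: read CSV rows
        rows = list(csv.DictReader(f))
    for row in rows:                      # transform: coerce a numeric field
        row["amount"] = float(row["amount"])
    with open(dest, "w") as f:            # load: write JSON
        json.dump(rows, f, indent=2)

etl("input.csv", "output.json")
```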
security principles, system performance optimisation, and infrastructure reliability. Experience working on large-scale, production-grade systems with distributed architectures. Nice to Have: Exposure to tools like Elasticsearch/Kibana, Hadoop/HBase, OpenSearch, or VPN/proxy architectures. Ideal Candidate will: Bring technical vision, initiative, and a passion for exploring and implementing emerging technologies. Be a natural technical leader …
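As a minimal sketch of the Elasticsearch exposure listed under nice-to-haves, using the official Python client (8.x API); the endpoint, index, and query are placeholders.

```python
from elasticsearch import Elasticsearch

# Index one document, then search for it with a match query.
es = Elasticsearch("http://localhost:9200")
es.index(index="logs", document={"level": "ERROR", "msg": "disk full"})
resp = es.search(index="logs", query={"match": {"level": "ERROR"}})
print(resp["hits"]["total"])
```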
with Temporal or similar orchestration tools. • Messaging & Streaming: Exposure to RabbitMQ or similar message/streaming broker technologies. • Advanced Technologies: Interest or experience in big data technologies (e.g., HBase, Hadoop), machine learning frameworks (e.g., Spark), and orbit dynamics. Why Join Us? • Innovative Environment: Be part of projects at the cutting edge of space systems and security. • Agile Culture: Work …
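For the RabbitMQ exposure mentioned, a minimal publish sketch using the pika client; the host, queue, and message are placeholders.

```python
import pika

# Declare a queue on a local broker and publish one message to it.
connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="telemetry")
channel.basic_publish(exchange="", routing_key="telemetry", body=b"orbit update")
connection.close()
```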
Supermicro is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are among the fastest-growing companies in the Silicon Valley Top 50 technology firms. Our unprecedented global …
Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues of $13+ billion. Location: London. Skill: Apache Hadoop. We are looking for open-source contributors to Apache projects who have an in-depth understanding of the code behind the Apache ecosystem, with experience in Cloudera or … hybrid cloud environment. Ability to debug and fix code in the open-source Apache codebase; should be an individual contributor to open-source projects. Job description: The Apache Hadoop project requires up to 3 individuals with experience in designing and building platforms, and supporting applications both in cloud environments and on-premises. These resources are expected to be … to support all developers in migrating and debugging various RiskFinder critical applications. They need to be "Developers" who are experts in designing and building Big Data platforms using Apache Hadoop and supporting Apache Hadoop implementations both in cloud environments and on-premises.
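To make the Apache Hadoop platform work concrete, a minimal Hadoop Streaming mapper sketch (word count) in Python; the reducer would sum the emitted counts per key, and the invocation paths shown are illustrative.

```python
#!/usr/bin/env python3
# mapper.py - emit "<word>\t1" for every word read from stdin. Pair with a
# reducer that sums counts per key, invoked along the lines of:
#   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py \
#          -input /data/in -output /data/out
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```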