analysis. Expertise in Python, with proficiency in ML and NLP libraries such as Scikit-learn, TensorFlow, Faiss, LangChain, Transformers and PyTorch. Experience with big data tools such as Hadoop, Spark, and Hive. Familiarity with CI/CD and MLOps frameworks for building end-to-end ML pipelines. Proven ability to lead and deliver data science projects in an agile …
communication and stakeholder management skills when engaging with customers Significant experience of coding in Python and Scala or Java Experience with big data processing tools such as Hadoop or Spark Cloud experience; GCP specifically in this case, including services such as Cloud Run, Cloud Functions, BigQuery, GCS, Secret Manager, Vertex AI etc. Experience with Terraform Prior experience in a …
part of an Agile engineering or development team Strong hands-on experience and understanding of working in a cloud environment such as AWS Experience with EMR (Elastic Map Reduce), Spark Strong experience with CI/CD pipelines with Jenkins Experience with the following technologies: SpringBoot, Gradle, Terraform, Ansible, GitHub/GitFlow, PCF/OCP/Kubernetes technologies, Artifactory, IaC …
programming languages such as Python, Java and Scala, and experience with ML frameworks like TensorFlow, PyTorch, and scikit-learn. Experience with cloud platforms (e.g., AWS), big data technologies (e.g., Spark) as well as other technologies used to deploy models to production (e.g., Kubernetes, GHA, Airflow, Docker etc.). Accommodation requests If you need assistance with any part of the …
applications, including secure coding practices, data protection, and compliance with industry standards (e.g., GDPR, SOC 2). Experience with data engineering: Familiarity with big data architectures, streaming technologies (Kafka, Spark), or large-scale data processing pipelines. Cross-functional collaboration: Experience working with product, design, and other cross-functional teams to ensure seamless integration of technology solutions with business goals.
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
etc. Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable: Jupyter Hub Awareness RabbitMQ or other common queue …
Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that …
its native tech stack in designing and building data & AI solutions Experience with data modeling, ETL processes, and data warehousing Knowledge of big data tools and frameworks such as Spark, Hadoop, or Kafka.
ETL processes is mandatory Experience building components for enterprise data platforms (data warehouses, Operational Data Stores, API access layers, file extracts, user queries) Hands-on experience with SQL, Python, Spark, Kafka Excellent communication skills, with proficiency in verbal and written English About this Job This role involves developing and maintaining real-time data processing pipelines for enterprise customer data.
infrastructure from the ground up. Familiarity with AWS services like S3, EMR, and technologies like Terraform and Docker. Know the ins and outs of current big data frameworks like Spark or Flink, but this is not an absolute requirement - you're a quick learner! This role is open to individuals based in or willing to relocate to London.
working practices CI/CD tooling Scripting experience (Python, Perl, Bash, etc.) ELK (Elastic stack) JavaScript Cypress Linux experience Search engine technology (e.g., Elasticsearch) Big Data Technology experience (Hadoop, Spark, Kafka, etc.) Microservice and cloud native architecture Desirable skills Able to demonstrate experience of troubleshooting and diagnosis of technical issues. Able to demonstrate excellent team-working skills. Strong interpersonal …
align tech strategy with business objectives and cost efficiency. Security & Compliance : Strong understanding of GDPR, API authentication, and observability. Big Data : Experience with data lakes, warehouses, and tools like Spark, Kafka, and Airflow. ETL Expertise : Ability to evaluate and optimize data ingestion and transformation pipelines. DevOps & CI/CD : Hands-on experience with Jenkins, GitHub Actions, Terraform, and CloudFormation.
handle multiple assignments or projects. Familiarity with version control tools (e.g., Git) and Agile work practices. Exposure to cloud platforms (e.g., AWS, GCP, Azure) or big data tools (e.g., Spark) is a plus. Retail/apparel industry knowledge is advantageous.
solutions in Python and related ML libraries Strong background in applied machine learning, model development and data engineering Experience with cloud environments (Azure, AWS, GCP) and tools such as Spark, Hive, Redshift Demonstrated ability to lead cross-functional teams and mentor junior practitioners Ability to communicate complex technical concepts clearly to non-technical audiences Bonus Points For Participation in …
Engineering using a high-level language like Go, Java, JavaScript, Python Distributed Software Architecture exposure in high volume production scenarios Working with Data Mesh, Big Data technologies such as EMR, Spark, Databricks Designing, tracking and testing to SLOs and Chaos Engineering to Error Budgets Implementing Business Continuity (BCP) and Disaster Recovery (DRP) plans including tracking RTO and RPO CryptoCurrency domain …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Northrop Grumman Corp. (JP)
developing & deploying scalable backend systems. Familiarity with CI/CD, containerisation, deployment technologies & cloud platforms (Jenkins, Kubernetes, Docker, AWS) or Familiarity with Big Data and Machine Learning technologies (NumPy, PyTorch, TensorFlow, Spark). Excellent communication, collaboration & problem-solving skills, ideally with some experience in agile ways of working. Security clearance: You must be able to gain and maintain the highest level …
management and monitoring. Hands-on experience with AWS Have a good grasp of IaC (Infrastructure-as-Code) tools like Terraform and CloudFormation. Previous exposure to additional technologies like Python, Spark, Docker, Kubernetes is desirable. Ability to develop across a diverse technology stack and willingness and ability to take on new technologies. Demonstrated experience participating on cross-functional teams in …
with modern inference servers and API gateways for AI applications Nice to have: Infrastructure as Code experience with Terraform, Ansible, or CloudFormation Distributed computing experience with Databricks, Ray, or Spark for large-scale AI workloads AI safety & governance experience with model evaluation, bias detection, and responsible AI practices Multi-modal AI experience with vision-language models, speech processing, or …
Research/Statistics or other quantitative fields. Experience in NLP, image processing and/or recommendation systems. Hands-on experience in data engineering, working with big data frameworks like Spark/Hadoop. Experience in data science for e-commerce and/or OTA. We welcome both local and international applications for this role. Full visa sponsorship and relocation assistance …
deep learning, GenAI, LLMs, etc., as well as hands-on experience with AWS services like SageMaker and Bedrock, and programming skills such as Python, R, SQL, Java, Julia, Scala, Spark/NumPy/Pandas/scikit, JavaScript. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace …