London, South East England, United Kingdom Hybrid / WFH Options
Intellect Group
such as Bloomberg, Refinitiv, or Open Banking. Experience with cloud platforms (AWS, GCP, or Azure) for model deployment. Understanding of big data technologies like Spark or Hadoop. Knowledge of algorithmic trading, credit risk modelling, or payment fraud detection. Benefits 💰 Competitive Salary & Bonus: £35,000 - £45,000 plus performance …
London (Hounslow), South East England, United Kingdom
eTeam
including OAuth, JWT, and data encryption. • Fluent in English with strong communication and collaboration skills. Preferred Qualifications: • Experience with big data processing frameworks like Apache Flink or Spark. • Familiarity with machine learning models and AI-driven analytics. • Understanding of front-end and mobile app interactions with backend services. • Expertise …
Knowledge of cloud platforms (e.g., Azure). Familiarity with containerization is a plus (e.g., Docker, Kubernetes). Knowledge of big data technologies (e.g., Hadoop, Spark). Knowledge of data lifecycle management. Strong problem-solving skills and attention to detail. Ability to work in an agile development environment. Excellent communication …
London, South East England, United Kingdom Hybrid / WFH Options
Merlin Entertainments
Deep expertise with Databricks and modern data platforms in the cloud (Azure, AWS, or GCP). Strong technical background in big data frameworks (e.g., Spark, Kafka), distributed systems, and scalable data architectures. Excellent understanding of data governance, security, and privacy, with practical knowledge of GDPR compliance. Track record of …
awareness, able to prioritise across several projects and to lead and coordinate larger initiatives. Good Python and SQL skills, experience with the AWS stack, Spark, Databricks and/or Snowflake desirable. Solid understanding of statistical modelling and machine learning algorithms, and experience deploying and managing models in production. Experience …
working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace: We are proud to be an equal opportunity workplace. We do not discriminate based …
building ETL pipelines. Experience with SQL. Experience mentoring team members on best practices. PREFERRED QUALIFICATIONS: Experience with big data technologies such as Hadoop, Hive, Spark, EMR. Experience operating large data warehouses. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our …
and machine learning and extensive practical experience with it. Expert knowledge and experience with relevant programming languages (incl. Python), frameworks (incl. PyCharm, OpenAI, HuggingFace, Spark, Azure, AWS). Extensive experience with cloud environments (AWS, Azure, GCP). Ability to write highly performant code working with big data. Bachelor's …
in cloud architecture and implementation. Bachelor's degree in Computer Science, Engineering, a related field, or equivalent experience. Experience with databases (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis). Experience in consulting, design and implementation of serverless distributed solutions. Experience in software development with an object-oriented language. AWS experience preferred, with …
etc.) Have experience productionising machine learning models. Are an expert in one of predictive modelling, classification, regression, optimisation or recommendation systems. Have experience with Spark. Have knowledge of DevOps technologies such as Docker and Terraform, and MLOps practices and platforms like MLflow. Have experience with agile delivery …
experience with SQL, Python, R, or similar languages for data analysis. Familiarity with cloud platforms (e.g., AWS, Google Cloud) and big data tools (e.g., Spark, Snowflake). Exceptional leadership, project management, and interpersonal skills with a proven ability to manage and scale teams. Strong business acumen with the ability …
or development team. Strong hands-on experience and understanding of working in a cloud environment such as AWS. Experience with EMR (Elastic MapReduce) and Spark. Strong experience with CI/CD pipelines with Jenkins. Experience with the following technologies: Spring Boot, Gradle, Terraform, Ansible, GitHub/GitFlow, PCF/OCP …
as well as hands-on experience with AWS services like SageMaker and Bedrock, and programming skills such as Python, R, SQL, Java, Julia, Scala, Spark/NumPy/pandas/scikit, JavaScript. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a …
in the legal domain. Ability to communicate with multiple stakeholders, including non-technical legal subject matter experts. Experience with big data technologies such as Spark, Hadoop, or similar. Experience conducting world-leading research, e.g. by contributions to publications at leading ML venues. Previous experience working on large-scale data …
3rd party packages, including MLflow, Seldon for ML model tracking and deployment, Kubernetes for hosting models, Argo and Git for CI/CD automation, Spark for big data processing. This is a rapidly changing field and we are deeply involved in the open source community to help shape the technology … to adopt in order to enhance our platform. What you'll do: Develop across our evolving technology stack - we're using Python, Java, Kubernetes, Apache Spark, Postgres, ArgoCD, Argo Workflow, Seldon, MLflow and more. We are migrating into the AWS cloud and adopting many services that are available in … desire to write clean, correct and efficient code. Sense of ownership, urgency and pride in your work. Experience with Python, Java, Docker, Kubernetes, Argo, Spark and AWS cloud services a plus. Exposure to Machine Learning practices a plus. We strive to create an accessible and inclusive experience for all …
QUALIFICATIONS - Implementation experience with AWS services - Hands-on experience leading large-scale global data warehousing and analytics projects. - Experience using some of the following: Apache Spark/Hadoop, Flume, Kinesis, Kafka, Oozie, Hue, Zookeeper, Ranger, Elasticsearch, Avro, Hive, Pig, Impala, Spark SQL, Presto, PostgreSQL, Amazon EMR, Amazon …
Purview or equivalent for data governance and lineage tracking. Experience with data integration, MDM, governance, and data quality tools. Hands-on experience with Apache Spark, Python, SQL, and Scala for data processing. Strong understanding of Azure networking, security, and IAM, including Azure Private Link, VNETs, Managed Identities …
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. Benefits: At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of …
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Experian Group
Significant experience of programming using Scala and Python. Experience of using Terraform to provision and deploy cloud services and components. Experience of developing on Apache Spark. Experience of developing with AWS cloud services including (but not limited to) AWS Glue, S3, Step Functions, Lambdas, EventBridge and SQS. BDD …