City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
Looker. • Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solutions Architect Associate, Data Analytics Specialty, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits …
South East London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Looker. • Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solutions Architect Associate, Data Analytics Specialty, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits …
or similar languages (e.g. Java or Python) Software collaboration and revision control (e.g. Git or SVN) Desired skills and experiences: Elasticsearch/Kibana Cloud computing (e.g. AWS) Hadoop/Spark etc. Graph Databases Educational level: Master's Degree Tagged as: Clustering, Data Mining, Industry, Information Retrieval, Master Degree, Sentiment Analysis, United Kingdom …
TensorFlow, PyTorch, or scikit-learn. Familiarity with cloud platforms like AWS, GCP, or Azure. Strong written and spoken English skills. Bonus Experience: Experience with big data tools (e.g., Hadoop, Spark) and distributed computing. Knowledge of NLP techniques and libraries. Familiarity with Docker, Kubernetes, and deploying machine learning models in production. Experience with visualization tools like Tableau, Power BI, or …
MXNet, Caffe2, TensorFlow, Theano, CNTK, Keras) and ML tools (SparkML, AML). 7+ years in IT platform implementation, consulting, and distributed solutions design. Experience with databases (SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis), cloud solutions (AWS or equivalent), systems, networks, and operating systems. If you need accommodations during the application process, please visit this link.
mindset with ability to think strategically about business, product, and technical challenges in an enterprise environment - Extensive hands-on experience with data platform technologies, including at least three of: Spark, Hadoop ecosystem, orchestration frameworks, MPP databases, NoSQL, streaming technologies, data catalogs, BI and visualization tools - Proficiency in at least one programming language (e.g., Python, Java, Scala), infrastructure as code …
would really make your application stand out: Implementation experience with Machine Learning models and applications Knowledge of cloud-based Machine Learning engines (AWS, Azure, Google, etc.) Experience with large scale data processing tools (Spark, Hadoop, etc.) Ability to query and program databases (SQL, NoSQL) Experience with distributed ML frameworks (TensorFlow, PyTorch, etc.) Familiarity with collaborative software tools (Git, Jira, etc.) Experience with user interface libraries/…
in Python, and deploying in production environments like AWS. Explore & Prototype: Help bring new ideas to life by quickly prototyping new models and frameworks that solve business problems or spark client interest. Own & Iterate: Take ownership of smaller workstreams within larger projects, with opportunities to grow into leading entire projects. Solve Across the Stack: You'll work end-to …
solving real-world business problems using machine learning, deep learning, data mining and statistical algorithms • Strong hands-on programming skills in Python, SQL, Hadoop/Hive. Additional knowledge of Spark, Scala, R, Java desired but not mandatory • Strong analytical thinking • Ability to creatively solve business problems, innovating new approaches where required and articulating ideas to a wide range of …
developing & deploying scalable backend systems. Familiarity with CI/CD, containerisation, deployment technologies & cloud platforms (Jenkins, Kubernetes, Docker, AWS) or Familiarity with Big Data and Machine Learning technologies (NumPy, PyTorch, TensorFlow, Spark). Excellent communication, collaboration & problem-solving skills, ideally with some experience in agile ways of working. Security clearance: You must be able to gain and maintain the highest level …
management and monitoring. Hands-on experience with AWS. Have a good grasp of IaC (Infrastructure-as-Code) tools like Terraform and CloudFormation. Previous exposure to additional technologies like Python, Spark, Docker, Kubernetes is desirable. Ability to develop across a diverse technology stack and willingness and ability to take on new technologies. Demonstrated experience participating on cross-functional teams in …
a platform team Have experience of building and leading a distributed team Be passionate about internal quality, good code and effective technical practice Know our tech stack: Terraform, Python, Spark, Kafka, Kubernetes, Databricks and AWS services YOU WILL ENJOY 25 days holiday + bank holidays + your birthday + volunteer day Regular shutdown days, including the festive period Hybrid …
languages such as Python, Java, or Scala, and experience with ML frameworks like TensorFlow, PyTorch, or scikit-learn Experience with large-scale distributed systems and big data technologies (e.g., Spark, Hadoop, Kafka) Accommodation requests If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please …
some of the brightest technical minds in the industry today. BASIC QUALIFICATIONS - 10+ years of technical specialist, design and architecture experience - 10+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 10+ years of consulting, design and implementation of serverless distributed solutions experience - Australian citizen with ability to obtain security clearance. PREFERRED QUALIFICATIONS - AWS Professional-level certification …
current cyber security threats, actors and their techniques. Experience with data science, big data analytics technology stack, analytic development for endpoint and network security, and streaming technologies (e.g., Kafka, Spark Streaming, and Kinesis). Strong sense of ownership combined with collaborative approach to overcoming challenges and influencing organizational change. Amazon is an equal opportunities employer. We believe passionately that …
Infrastructure Architect Sr., Cloud Control Management (5 locations, full time, posted 16 days ago). Senior Software Engineer (Java, Python, Kafka, Jenkins, Spark) (5 locations, full time, posted 13 days ago). Top Reasons to Join PNC: Being a great place to work means we are making a lasting …
Principal Data Engineer, Consulting Leeds Based You must be eligible for SC Clearance Role Overview The Principal Data Engineer will be responsible for designing and implementing cloud-based data solutions using a range of AWS services. This role involves working …
Principal Data Engineer, Consulting Bristol Based You must be eligible for SC Clearance Role Overview The Principal Data Engineer will be responsible for designing and implementing cloud-based data solutions using a range of AWS services. This role involves working …
Principal Data Engineer, Consulting London Based You must be eligible for SC Clearance Role Overview The Principal Data Engineer will be responsible for designing and implementing cloud-based data solutions using a range of AWS services. This role involves working …
In this position, you'll be based in the London office for a minimum of three days a week, with the flexibility to work from home for some of your working week. Find out more about our flexible work culture at We …
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
Our client is a leading UK-based consultancy seeking a skilled professional to shape data strategies, mentor dynamic teams, and deliver cutting-edge solutions. With hands-on expertise in Spark, SQL, and cloud platforms like Azure, you’ll lead end-to-end projects, drive innovation, and collaborate with clients across industries. What You’ll Do: Lead complex data engineering … multi-project environments experience. Expertise in ETL, data modelling, and Azure Data Services. Experience in designing and implementing data pipelines, data lakes, and data warehouses. Hands-on experience with Apache Spark and bonus points for Microsoft Fabric. Any certifications are a bonus. Benefits: Competitive base salary Hybrid work once a week into their Central London office 25 days …
collaboration, life-long learning, and driving business value through ML. Company-first focus and collaborative individuals - we work better when we work together. Preferred Experience working with Databricks and Apache Spark. Preferred Experience working in a customer-facing role. About Databricks: Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast … Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. Benefits: At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details …