Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or relational databases. Demonstrated knowledge of software applications and technical processes within a cloud or microservice architecture. Hands …
DevOps experience in CI/CD. Experience using TensorFlow, Kubernetes, MLflow, Kafka, and Airflow. Experience using Python is a must (tools like AWS and Spark are beneficial). Excellent communication skills and strong team and colleague engagement. A keen interest in problem-solving and using scalable machine learning to solve the …
and implement pre-processing pipelines for large data, create visualisations and reports for model performance, whilst collaborating with various engineers to improve knowledge and spark innovation. As the Machine Learning Engineer you will ideally have a degree in a relevant field: Computer Science, Maths, AI, or similar; at least …
in Python, R, and SQL. Extensive experience (over 5 years) in building machine learning models. Understanding of underlying data systems such as cloud architectures, Kubernetes, Spark, and SQL. Fluency in English and German, with French being a plus. Desirable experience in Consulting or Customer-facing Data Science roles, Data Engineering …
you! Minimum Qualifications: Bachelor's or Master's degree in Engineering or Computer Applications. Hands-on experience with MS SQL Server and GCP. Familiarity with BQ, Spark, Hive, Pig, and other analytical tools. Understanding of the finance domain. Preferred Qualifications: Experience in SAP data modelling. Genpact is an Equal Opportunity Employer and …
wrangling, modelling, feature engineering and deployment. What's equally relevant is programming knowledge and a gift for coding using R, Python, SQL, SAS or Spark. You'll also need the capability to deliver projects, collaborate with operational teams and be able to explain technical concepts to non-specialists. What …
EC1N, Farringdon Without, Greater London, United Kingdom
Damia Group Ltd
skilled SC Cleared DevSecOps Engineer to join their team, where your key responsibilities will be to: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. Skills/Experience Required … of the SC Cleared DevSecOps Engineer: Strong operational procedures knowledge. Proficient in Apache Spark, AWS RDS (MySQL), and Hadoop. Knowledge of Tableau and Red Hat Decision Central. If this SC Cleared DevSecOps Engineer role sounds like a good fit for you, get in touch now for a prompt discussion …
Employment Type: Permanent
Salary: £50000 - £65000/annum 15% cash flex and 10% bonus
engineer will supply advice to analytical users on how they can access and utilise the new datasets. Qualities: Comfortable with Python, ideally with Apache Spark and PySpark experience. Previous data analytics software experience. Able to scope new integrations and translate analytical user needs into technical requirements. UK-based …
ideally in a start-up or scale-up. Machine learning libraries and frameworks (TensorFlow, PyTorch, scikit-learn). Python. Big data processing tools (e.g., Spark). The role offers a salary range of £70-100K depending on experience. The successful candidate must be able to work from …
in programming languages commonly used in machine learning, preferably Python. Experience with machine learning frameworks and libraries, such as TensorFlow, PyTorch, scikit-learn, or Apache Spark. Proven track record of developing and implementing machine learning solutions in a professional setting. Passion for exploring new technologies and driving innovation in …
product experimentation, Causal AI, and advanced statistical techniques. Deep knowledge of data science tools (e.g., scikit-learn, TensorFlow, PyTorch) and big data technologies (e.g., Spark). Proficiency in Python for data manipulation, model building, and scripting. Strong communication skills to present findings to both technical and non-technical audiences. …
build their AI practice and a team around you. Required Skills: Building cloud-native machine learning architectures with LlamaIndex, HuggingFace, SentenceTransformers, PyTorch, Python, and Apache Spark. Experience with practical application of AI and scaling AI with these tools. Experience in Health Care is essential. We would love to share …
issues. The person: Degree in Engineering, Technology, or related fields/equivalent. 3+ years in AI solution delivery. Experienced in relevant technologies (Python, TensorFlow, Spark, Azure Cloud, Git, Docker). Strong analytical and communication skills. This comes with a fantastic salary and full benefits package – happy to discuss in full. …
You love people. Nothing excites you more than spotting that one thing in a person others may have missed, that one spark or twinkle that piques your interest during an interview. Then watching that same person progress up the career ladder you implemented, using the techniques you taught them …
ingestion pipelines. Requirements: Proven experience working with Python, Java or C#. Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive. Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation and driving continual …
major advantage. Experience in building and enhancing compute, storage, and data platforms with exposure to open-source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, NATS and the like. Hands-on with infrastructure-as-code tools and automation, such as Terraform, Ansible, or Helm. The role: Tech Lead responsible …
network programming. Experience in building and enhancing compute, storage, and data platforms with exposure to open-source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, NATS and the like. Hands-on with infrastructure-as-code tools and automation, such as Terraform, Ansible, or Helm. The role: Tech Lead responsible …
or Rust. Experience in building and enhancing compute, storage, and data platforms with exposure to open-source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, NATS, etc. Hands-on experience with IaC tools and automation, such as Terraform, Ansible, or Helm. Active engagement or contributions to the open-source …
equity financing to mid-market and late-stage companies. Liquidity Group is backed by leading global financial institutions including Japan’s largest bank, MUFG, Spark Capital, and Apollo Asset Management. About the role: We're on the lookout for accomplished credit professionals to assume the role of Director within …
Accelerate customers’ advertising campaigns. The team collaborates closely with other ML-oriented teams at Liftoff. We use many modern ML frameworks and architectures, including Spark and PyTorch, to build deep neural networks. Location: This role is located in Liftoff's Hub in Paris, France, or can be in any of …