Lexington Park, Maryland, United States Hybrid / WFH Options
Spalding, a Saalex Company
Experience with data science packages desired. Knowledge of various machine learning techniques (e.g., clustering, decision trees, neural networks), model evaluation, and deployment desired. Familiarity with big data platforms (e.g., Hadoop, Spark) and cloud services (e.g., AWS). Experience working with Application Lifecycle and Configuration and Knowledge Management software is highly desired. Experience with GitLab is preferred. Experience with UNIX More ❯
like TensorFlow, PyTorch, or scikit-learn. Familiarity with cloud platforms like AWS, GCP, or Azure. Strong written and spoken English skills. Bonus Experience: Experience with big data tools (e.g., Hadoop, Spark) and distributed computing. Knowledge of NLP techniques and libraries. Familiarity with Docker, Kubernetes, and deploying machine learning models in production. Experience with visualization tools like Tableau, Power BI More ❯
City of London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
. Strong knowledge of LLM algorithms and training techniques . Experience deploying models in production environments. Nice to Have: Experience in GenAI/LLMs Familiarity with distributed computing tools (Hadoop, Hive, Spark). Background in banking, risk management, or capital markets . Why Join? This is a unique opportunity to work at the forefront of AI innovation in financial More ❯
Experience with data analysis and visualization tools (e.g., Matplotlib, Seaborn, Tableau). Ability to work independently and lead projects from inception to deployment. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable. MSc or PhD in Computer Science, Data Science, or related field is preferred. Don't meet every single requirement More ❯
Qualifications Skill requirements: • Professional experience in advanced analytics, machine learning or AI • Extensive experience in Data Engineering (ETL, validation, analysis, deductions) • Big Data skills in Elasticsearch, SQL, Python, Hadoop, PySpark, Linux shell • Expert knowledge in automated provisioning and pre-processing of various data sets • Experienced in developing ML models in Python and technical and methodological aspects of running More ❯
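The data-engineering skills this listing names (ETL with validation) can be illustrated with a minimal sketch. This is not code from any of the employers above; the function names and sample records are hypothetical, and it uses plain Python in place of the Hadoop/PySpark stack the ad refers to.

```python
# Illustrative sketch only: a minimal extract-validate-transform-load (ETL)
# pass of the kind the "Data Engineering (ETL, validation, analysis)" bullet
# describes. All names and sample records are hypothetical.

def extract(rows):
    """Extract: yield raw records from an upstream source (here, a list)."""
    yield from rows

def validate(record):
    """Validation: keep only records with a positive numeric 'amount'."""
    amount = record.get("amount")
    return isinstance(amount, (int, float)) and amount > 0

def transform(record):
    """Transform: normalise the currency field and round the amount."""
    return {
        "id": record["id"],
        "currency": record.get("currency", "GBP").upper(),
        "amount": round(float(record["amount"]), 2),
    }

def load(records):
    """Load: collect cleaned records into the target store (here, a list)."""
    return list(records)

raw = [
    {"id": 1, "amount": 10.456, "currency": "gbp"},
    {"id": 2, "amount": -5},        # rejected by validation
    {"id": 3, "amount": "oops"},    # rejected by validation
    {"id": 4, "amount": 3.0},
]
clean = load(transform(r) for r in extract(raw) if validate(r))
print(clean)
```

In a production PySpark pipeline each stage would typically become a DataFrame operation rather than a Python generator, but the extract/validate/transform/load separation is the same.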
London, England, United Kingdom Hybrid / WFH Options
Delta Capita
data models Programming languages such as Python, SQL, or Spark Expertise in data visualisation tools such as Qlik, Power BI (Business Intelligence) or Tableau Data engineering tools such as Hadoop, Kafka, SQL Experience of analytics and being able to review and present insights in large data sets formed of complex data, including with software such as Alteryx or DI More ❯
Maidenhead, England, United Kingdom Hybrid / WFH Options
BookFlowGo
and deploying real-time pricing or recommendation systems Deep technical knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch), cloud platforms (AWS, GCP, Azure), and big data tools (e.g., Spark, Hadoop) Experience designing and/or implementing business-critical operational algorithms Clear communication skills; experienced in communicating with senior stakeholders and translating complex technical solutions into business impact Excellent problem More ❯
a degree or interest in the legal domain. Ability to communicate with multiple stakeholders, including non-technical legal subject matter experts. Experience with big data technologies such as Spark, Hadoop, or similar. Experience conducting world-leading research, e.g. by contributions to publications at leading ML venues. Previous experience working on large-scale data processing systems. Strong software and/ More ❯
Lexington, Massachusetts, United States Hybrid / WFH Options
Equiliem
reinforcement learning concepts, frameworks, and environments, such as Pandas, TensorFlow, and Jupyter Notebook. • Broad knowledge of the general features, capabilities, and trade-offs of common data warehouse (e.g. Apache Hadoop); workflow orchestration (e.g. Apache Beam); data extract, transform and load (ETL); and stream processing (e.g. Kafka) technologies. Hands-on experience with several of these technologies. -This position can be More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Ripjar
in production and have a curiosity and interest in learning more. In this role, you will be using python (specifically pyspark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & Nifi are used to co-ordinate the processing of data, while Jenkins … software systems in production and have a curiosity and interest in learning more. You will be using Python (specifically pyspark) and Node.js for processing data You will be using Hadoop stack technologies such as HDFS and HBase Experience using MongoDB and Elasticsearch for indexing smaller datasets would be beneficial Experience using Airflow & Nifi to co-ordinate the processing of More ❯
London, England, United Kingdom Hybrid / WFH Options
Microsoft
/CD, Docker, or REST API. Proficiency in data science tools, deployment technologies, and frameworks, such as Azure Machine Learning, Azure Cognitive Services, Azure Databricks, Kubernetes or Azure Functions, Hadoop, Spark, Delta Lake, MLflow, or TensorFlow. Proficiency in communicating with clarity and impact, such as influencing others, oral communication, storytelling, technical communication, or written communication. #Azurecorejobs Responsibilities Understand the business More ❯
London, England, United Kingdom Hybrid / WFH Options
Ripjar
in production and have a curiosity and interest in learning more. In this role, you will be using python (specifically pyspark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & Nifi are used to co-ordinate the processing of data, while Jenkins … software systems in production and have a curiosity and interest in learning more. You will be using Python (specifically pyspark) and Node.js for processing data You will be using Hadoop stack technologies such as HDFS and HBase Experience using MongoDB and Elasticsearch for indexing smaller datasets would be beneficial Experience using Airflow & Nifi to co-ordinate the processing of More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Mars
help shape our digital platforms 🧠 What we’re looking for Proven experience as a Data Engineer in cloud environments (Azure ideal) Proficiency in Python, SQL, Spark, Databricks Familiarity with Hadoop, NoSQL, Delta Lake Bonus: Azure Functions, Logic Apps, Django, CI/CD tools 💼 What you’ll get from Mars A competitive salary & bonus Hybrid working with flexibility built in More ❯
and continuous delivery Excellent problem-solving skills and a collaborative mindset Agile development experience in a team setting Bonus Skills (nice to have) Experience with big data tools like Hadoop, Spark, or Scala Exposure to fraud, payments , or financial services platforms Understanding of cloud-native development and container orchestration Knowledge of test-driven development and modern code quality practices More ❯
Telford, England, United Kingdom Hybrid / WFH Options
Supermercados Guanabara
/machine learning libraries Background in agile software development environments Ability to estimate effort, manage dependencies, and communicate effectively with technical and non-technical stakeholders Desirable skills: Experience with Hadoop and Jenkins AWS or Azure certifications Familiarity with Java If you would like to learn more about the role, please apply through the advert and we will be in More ❯
London, England, United Kingdom Hybrid / WFH Options
Canonical
Like Work across the entire Linux stack, from kernel, networking, storage, to applications. Design and deliver open source code using Python. Architect cloud infrastructure solutions like OpenStack, Kubernetes, Ceph, Hadoop, and Spark on-premises or in public cloud platforms (AWS, Azure, Google Cloud). Mentor and develop colleagues with insights. Foster a healthy, collaborative engineering culture aligned with company More ❯
if you Have experience with Cloud-based or SaaS products and a good understanding of Digital Marketing and Marketing Technologies. Have experience working with Big Data technologies (such as Hadoop, MapReduce, Hive/Pig, Cassandra, MongoDB, etc.) An understanding of web technologies such as JavaScript, Node.js and HTML. Some level of understanding or experience in AI/ML. Physical More ❯
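The MapReduce model this listing names can be sketched in a few lines. This is an illustration in plain Python, not the actual Hadoop API; the phase names and sample documents are hypothetical.

```python
# Illustrative sketch only: the MapReduce model (map -> shuffle -> reduce)
# applied to word counting, in plain Python rather than the Hadoop API.
from collections import defaultdict

def map_phase(documents):
    """Map: emit (word, 1) pairs for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data big ideas", "big data tools"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts)
```

On a real Hadoop cluster the map and reduce phases run in parallel across workers and the shuffle happens over the network, but the per-phase contract is the same as in this single-process sketch.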
London, England, United Kingdom Hybrid / WFH Options
Veeva Systems, Inc
Knowledge of fuzzing, memory corruption, and exploit development Familiar with Jenkins, Bamboo, CI/CD Pipelines, and other automation tools Experience with Big Data technologies such as Elastic, Cloudera, Hadoop, Datadog, or others Experience maintaining security tools and automation scripts to streamline security processes #RemoteUK Veeva’s headquarters is located in the San Francisco Bay Area with offices in More ❯
Go) University degree (IT/math) or equivalent experience The following additional qualifications are a significant plus: Kubernetes knowledge and operating experience Experience with big data stack components like Hadoop, Spark, Kafka, Nifi Experience with data science/data analysis Knowledge of SRE/DevOps stacks - monitoring/system management tools (Prometheus, Ansible, ELK, …) Version control using git A More ❯
bring to the role? As you take on the Data Solution Architect role, you will have to come with previous experience of working with big data platforms such as Hadoop and Spark. Other key skills required for the role include: Extensive experience in data architecture and analytics platform solutions. Expertise in all things data including lifecycle, technologies, data-patterns More ❯
Birmingham, England, United Kingdom Hybrid / WFH Options
Ripjar
and data science libraries such as PyTorch, scikit-learn, numpy and scipy Good communication and interpersonal skills Experience working with large-scale data processing systems such as Spark and Hadoop Experience in software development in agile environments and an understanding of the software development lifecycle Experience using or implementing ML Operations approaches is valuable Working knowledge of statistics and More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Ripjar
and data science libraries such as PyTorch, scikit-learn, numpy and scipy Good communication and interpersonal skills Experience working with large-scale data processing systems such as Spark and Hadoop Experience in software development in agile environments and an understanding of the software development lifecycle Experience using or implementing ML Operations approaches is valuable Working knowledge of statistics and More ❯
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Danske Bank
adapt message for different stakeholder groups, including technical and non-technical audiences. Preferred 3rd Level degree in IT or related STEM discipline. Good working knowledge of Python. Experience with Hadoop and Spark. 1+ years' experience with IBM DataStage. Knowledge of cloud technologies such as AWS, Databricks, Snowflake. HOW WE WORK Our belief is that we are "Better when More ❯
Cheltenham, England, United Kingdom Hybrid / WFH Options
Ripjar
Ripjar specialises in the development of software and data products that help governments and organisations combat serious financial crime. Our technology is used to identify criminal activity such as money laundering and terrorist financing, enabling organisations to enforce sanctions at More ❯