Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience with Apache NiFi. Demonstrated experience with Extract, Transform, and Load (ETL) processes. Demonstrated experience managing and mitigating IT security vulnerabilities using Plans of Action and Milestones (POA&Ms). Demonstrated experience …
London, South East, England, United Kingdom Hybrid / WFH Options
Cathcart Technology
with new methodologies to enhance the user experience. Key skills: ** Senior Data Scientist experience ** Commercial experience in Generative AI and recommender systems ** Strong Python and SQL experience ** Spark/Apache Airflow ** LLM experience ** MLOps experience ** AWS Additional information: This role offers a strong salary of up to £95,000 (Depending on experience/skill) with hybrid working (2 days …
or in business administration, or equivalent through experience (more than 5 years). You are used to working with a range of data tools such as SSAS, SSIS, SSRS, AWS, Azure Data Factory, Apache Spark, SQL, Python, and Tableau. You have good database knowledge (SQL and NoSQL). You are fluent in French or Dutch and have full professional proficiency in English. You know …
Microsoft Azure and application workload management. Proficiency in object-oriented development techniques. Understanding of modern data processing architectures such as Data Lakehouse. Experience in data engineering technologies such as Apache Spark, Databricks, Python, and Scala. Experience working collaboratively as part of an Agile development squad. Degree in Computer Science or related subject. Bonus Points: Experience in the analytics industry. …
Jenkins, TeamCity Scripting languages such as PowerShell, bash Observability/Monitoring: Prometheus, Grafana, Splunk Containerisation tools such as Docker, K8S, OpenShift, EC, containers Hosting technologies such as IIS, nginx, Apache, App Service, LightSail Analytical and creative approach to problem solving. We encourage you to apply, even if you don't meet all of the requirements. We value your growth …
both strategic and hands-on levels. Prior experience contributing to open-source projects or standards bodies (e.g., JCP). Some familiarity with the Hazelcast platform or similar technologies (e.g., Apache Ignite, Redis, AWS ElastiCache, Oracle Coherence, Kafka, etc.). Experience writing technical whitepapers or benchmark reports. BENEFITS 25 days annual leave + Bank holidays Group Company Pension Plan Private …
future-proofing of the data pipelines. ETL and Automation Excellence: Lead the development of specialized ETL workflows, ensuring they are fully automated and optimized for performance using tools like Apache Airflow, Snowflake, and other cloud-based technologies. Drive improvements across all stages of the ETL cycle, including data extraction, transformation, and loading. Infrastructure & Pipeline Enhancement: Spearhead the upgrading of …
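The ETL cycle mentioned above — extract, transform, load — can be sketched in plain Python. This is an illustrative sketch only: the function names and the in-memory "warehouse" list are assumptions standing in for real source systems and a tool like Airflow or Snowflake, not part of any stack the listing names.

```python
# Minimal ETL sketch: each stage is a small pure function,
# which is what makes the pipeline easy to automate and test.

def extract():
    # Stand-in for pulling raw rows from a source system.
    return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.25"}]

def transform(rows):
    # Normalise types so the load stage receives clean records.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows, warehouse):
    # Stand-in for writing records to a warehouse table.
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # number of records that reached the warehouse
```

Keeping each stage as a separate function mirrors how workflow tools model the cycle: each stage can be retried or monitored independently.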
in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data solutions. Knowledge of MLOps practices …
between systems Experience with Google Cloud Platform (GCP) is highly preferred. (Experience with other cloud platforms like AWS or Azure can be considered.) Familiarity with data pipeline scheduling tools like Apache Airflow Ability to design, build, and maintain data pipelines for efficient data flow and processing Understanding of data warehousing best practices and experience in organising and cleaning up messy …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform) Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
environments, and better development practices Excellent written and verbal communication skills Experience with DevOps frameworks Entity Framework or similar ORM. Continuous Integration, Configuration Management. Enterprise Service Bus (ESB) Management (Apache ActiveMQ or NiFi) Technical Writing. Past Intelligence Systems experience. Experience with Test Driven Development Some system administration experience Experience with Jira, Confluence. U.S. Citizen. Must be able to …
technologies. Experience with CI/CD pipelines and integrating automated tests within them - Jenkins, BitBucket required. Familiarity with performance testing, security testing, and other non-functional testing approaches - JMeter, Apache Benchmark preferred. Good experience working with cloud technologies and services on AWS. Strong practical experience in Flyway or Liquibase. Strong understanding of modern technologies and adoption of advanced …
environments, and better development practices • Excellent written and verbal communication skills • Experience with DevOps frameworks • Entity Framework or similar ORM. • Continuous Integration, Configuration Management. • Enterprise Service Bus (ESB) Management (Apache Active MQ or NIFI) • Technical Writing • Past Intelligence Systems experience. • Experience with Test Driven Development • Some system administration experience • Experience with Jira, Confluence Desired Qualification: • AWS, biometric, Microservices, User …
data engineering or a related field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're proud of that. And it's …
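Workflow management tools like the Airflow and ADF mentioned above schedule tasks as a dependency graph (DAG). The core idea can be sketched with Python's standard-library `graphlib`; the task names here are hypothetical, chosen only to mirror a typical pipeline.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, which is
# how a workflow tool resolves a valid run order.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # dependency-respecting order: extract first, report last
```

A real orchestrator adds scheduling, retries, and parallel execution on top, but the run order it computes follows exactly this topological-sort rule.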
are some things Naimuri have worked on recently that might give you a better sense of what you'll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data into a common data format …
Azure) Experience managing PKI/X.509 certificate infrastructure. Extensive experience supporting and implementing TLS/SSL certificate management systems Proficient with Token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy Solid knowledge of Linux security and system operations. Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support …
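For TLS/SSL certificate-management work like that described above, Python's standard `ssl` module illustrates the secure-default posture such systems enforce. This is a minimal sketch of the defaults, not a description of any particular employer's setup.

```python
import ssl

# create_default_context() returns a client-side context with
# certificate verification and hostname checking enabled, and
# known-insecure protocol versions disabled.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # peer certificates must validate
print(ctx.check_hostname)                    # hostname must match the cert
print(ctx.minimum_version)                   # floor on the TLS protocol version
```

Certificate management in production then layers rotation, revocation checking, and monitoring of expiry dates on top of these verification defaults.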
substituted for a degree) • 15+ years of relevant experience in software development, ranging from work in a DevOps environment to full stack engineering • Proficiency in the following technologies: • Java • Apache NiFi workflow configuration and deployment • Databases such as PostgreSQL and MongoDB • Python and machine learning • Docker • Kubernetes • Cloud-like infrastructure • Experience with Jenkins for pipeline integration and deployment • Familiarity …
languages such as Python, Java, or C++. Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, Scikit-learn). Familiarity with data processing tools and platforms (e.g., SQL, Apache Spark, Hadoop). Knowledge of cloud computing services (e.g., AWS, Google Cloud, Azure) and containerization technologies (e.g., Docker, Kubernetes) is a plus. Hugging Face Ecosystem: Demonstrated experience using Hugging …
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
attitude, capable of acquiring new skills. Objective and logical with an enquiring and creative mind. It would be nice if you had: Data Engineering - experience of one or more: Apache ecosystem, SQL, Python. Web - HTML, CSS, JavaScript, XML, SOAP. Experience with Secure DevSecOps within an Agile/SAFe environment. Containerisation & Orchestration - Docker, Podman, Kubernetes, Rancher etc. Software development capability. …