with Data Quality and Data Governance concepts and experience. 11. (Desired) Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. 12. (Desired) Demonstrated experience with Apache Spark. B4CORP Company Information: B4Corp is a small defense contracting company that focuses on providing an optimum …
Substantial experience using tools for statistical modelling of large data sets. Some familiarity with data workflow management tools such as Airflow, as well as big data technologies such as Apache Spark or other caching and analytics technologies. Expertise in model training, statistics, model evaluation, deployment and optimisation, including RAG-based architectures. …
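As an illustration of the model training and evaluation skills this posting describes, here is a minimal, hedged sketch using scikit-learn on a synthetic dataset; the dataset, model choice, and metric are assumptions, not part of the role:

```python
# Minimal sketch of model training and evaluation with scikit-learn.
# The synthetic dataset and logistic-regression model are illustrative only.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Generate a toy binary-classification dataset standing in for real data.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train, then evaluate with a standard ranking metric.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```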
SQL and at least one programming language (e.g., Python). Familiarity with relational databases and data warehousing concepts. Understanding of ETL concepts and tools. Exposure to workflow orchestration and streaming tools such as Apache Airflow, NiFi, and Kafka. Strong analytical and problem-solving skills. Excellent communication and teamwork abilities. Eagerness to learn and grow in a fast-paced environment. Experience in Jupyter Notebooks …
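For orientation, a minimal Airflow DAG of the kind such orchestration work involves might look like the sketch below; the DAG id, schedule, and task bodies are hypothetical and assume Airflow 2.x:

```python
# Hedged sketch of a two-step ETL DAG in Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step: pull rows from a source system.
    return [{"id": 1, "value": 42}]


def load(ti):
    # Placeholder load step: read the upstream result from XCom.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract runs before load
```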
Salisbury, Wiltshire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
modern data lake/lakehouse architectures. Strong grasp of cloud data platforms (AWS, Azure, GCP, Snowflake). Understanding of Data Mesh, Data Fabric, and data-product-centric approaches. Familiarity with Apache Spark, Python, and ETL/ELT pipelines. Strong knowledge of data governance, lifecycle management, and compliance (e.g. GDPR). Consulting experience delivering custom data solutions across sectors. Excellent leadership, communication …
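As a hedged sketch of the Spark/Python pipeline work mentioned here, the following PySpark job reads raw data, cleans it, and writes a curated output; all paths, table names, and columns are assumptions:

```python
# Minimal PySpark ETL sketch: extract, transform, load.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: read raw CSV (the bucket path is hypothetical).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders.csv")

# Transform: cast types, drop incomplete rows, aggregate per customer.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["customer_id", "amount"])
)
summary = clean.groupBy("customer_id").agg(F.sum("amount").alias("total_spend"))

# Load: write the curated result as Parquet.
summary.write.mode("overwrite").parquet("s3://example-bucket/curated/spend/")
```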
Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, Elasticsearch, MapReduce, and HBase. Demonstrated experience with Apache NiFi. Demonstrated experience with Extract, Transform, and Load (ETL) processes. Demonstrated experience managing and mitigating IT security vulnerabilities using Plans of Action and Milestones (POA&Ms). Demonstrated experience …
London, South East, England, United Kingdom Hybrid / WFH Options
Cathcart Technology
with new methodologies to enhance the user experience. Key skills: Senior Data Scientist experience; commercial experience in Generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: this role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working (2 days …
Jenkins, TeamCity. Scripting languages such as PowerShell, Bash. Observability/monitoring: Prometheus, Grafana, Splunk. Containerisation tools such as Docker, K8s, OpenShift, EC, containers. Hosting technologies such as IIS, nginx, Apache, App Service, Lightsail. Analytical and creative approach to problem solving. We encourage you to apply, even if you don't meet all of the requirements. We value your growth …
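To ground the observability stack named above, here is a hedged sketch of exposing application metrics for Prometheus to scrape, using the official prometheus_client library; the metric names and port are illustrative:

```python
# Sketch: expose request count and latency metrics on /metrics.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")


def handle_request():
    with LATENCY.time():  # records the block's duration in the histogram
        time.sleep(random.random() / 10)  # stand-in for real work
    REQUESTS.inc()


if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes http://host:8000/metrics
    while True:
        handle_request()
```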
both strategic and hands-on levels. Prior experience contributing to open-source projects or standards bodies (e.g., the JCP). Some familiarity with the Hazelcast platform or similar technologies (e.g., Apache Ignite, Redis, AWS ElastiCache, Oracle Coherence, Kafka). Experience writing technical whitepapers or benchmark reports. BENEFITS: 25 days annual leave + bank holidays. Group Company Pension Plan. Private …
future-proofing of the data pipelines. ETL and Automation Excellence: Lead the development of specialized ETL workflows, ensuring they are fully automated and optimized for performance using tools such as Apache Airflow, Snowflake, and other cloud-based technologies. Drive improvements across all stages of the ETL cycle, including data extraction, transformation, and loading. Infrastructure & Pipeline Enhancement: Spearhead the upgrading of …
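As a hedged sketch of the Snowflake side of such an ETL cycle, the following uses the official Python connector to bulk-load staged files; the account, credentials, stage, and table names are all assumptions, and in practice credentials would come from a secrets manager:

```python
# Sketch: bulk-load staged Parquet files into a Snowflake table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",  # hypothetical account identifier
    user="etl_user",            # hypothetical service user
    password="change-me",       # fetch from a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # COPY INTO bulk-loads files from a named stage; names are illustrative.
    cur.execute(
        "COPY INTO staging.orders "
        "FROM @etl_stage/orders/ "
        "FILE_FORMAT = (TYPE = PARQUET)"
    )
finally:
    conn.close()
```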
between systems. Experience with Google Cloud Platform (GCP) is highly preferred (experience with other cloud platforms like AWS or Azure can be considered). Familiarity with data pipeline scheduling tools like Apache Airflow. Ability to design, build, and maintain data pipelines for efficient data flow and processing. Understanding of data warehousing best practices and experience in organising and cleaning up messy …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack, including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
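To illustrate the Kafka piece of a stack like this, here is a minimal, hedged producer/consumer sketch using the kafka-python library; the broker address, topic, and message shape are assumptions:

```python
# Sketch: publish and consume JSON events on a Kafka topic.
import json

from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("events", {"user_id": 1, "action": "click"})
producer.flush()  # make sure the message actually leaves the buffer

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",  # read from the start of the topic
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # {'user_id': 1, 'action': 'click'}
    break
```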
environments, and better development practices. Excellent written and verbal communication skills. Experience with DevOps frameworks. Entity Framework or similar ORM. Continuous Integration, Configuration Management. Enterprise Service Bus (ESB) management (Apache ActiveMQ or NiFi). Technical writing. Past intelligence systems experience. Experience with Test-Driven Development. Some system administration experience. Experience with Jira, Confluence. U.S. Citizen. Must be able to …
technologies. Experience with CI/CD pipelines and integrating automated tests within them (Jenkins, Bitbucket required). Familiarity with performance testing, security testing, and other non-functional testing approaches (JMeter, ApacheBench preferred). Good experience working with cloud technologies and services on AWS. Strong practical experience with Flyway or Liquibase. Strong understanding of modern technologies and adoption of advanced …
environments, and better development practices • Excellent written and verbal communication skills • Experience with DevOps frameworks • Entity Framework or similar ORM • Continuous Integration, Configuration Management • Enterprise Service Bus (ESB) management (Apache ActiveMQ or NiFi) • Technical writing • Past intelligence systems experience • Experience with Test-Driven Development • Some system administration experience • Experience with Jira, Confluence Desired Qualifications: • AWS, biometric, microservices, user …
data engineering or a related field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're proud of that. And it's …
are some things Naimuri have worked on recently that might give you a better sense of what you'll be doing day to day: improving systems-integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking; implementing AWS Service Control Policies to manage global access privileges; validating and converting data into a common data format …
Azure). Experience managing PKI/X.509 certificate infrastructure. Extensive experience supporting and implementing TLS/SSL certificate management systems. Proficient with token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy. Solid knowledge of Linux security and system operations. Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support …
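As a small, hedged illustration of certificate lifecycle work like this, the following standard-library sketch checks how many days remain before a server's TLS certificate expires; the hostname is illustrative:

```python
# Sketch: report days until a server's TLS certificate expires.
import socket
import ssl
import time


def cert_days_remaining(host: str, port: int = 443) -> int:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()  # parsed certificate as a dict
    # 'notAfter' holds the expiry timestamp, e.g. 'Jun  1 12:00:00 2025 GMT'.
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    return int((expires - time.time()) // 86400)


print(cert_days_remaining("example.com"))  # hypothetical host
```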
substituted for a degree) • 15+ years of relevant experience in software development, ranging from work in a DevOps environment to full-stack engineering • Proficiency in the following technologies: • Java • Apache NiFi workflow configuration and deployment • Databases such as PostgreSQL and MongoDB • Python and machine learning • Docker • Kubernetes • Cloud-like infrastructure • Experience with Jenkins for pipeline integration and deployment • Familiarity …
languages such as Python, Java, or C++. Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn). Familiarity with data processing tools and platforms (e.g., SQL, Apache Spark, Hadoop). Knowledge of cloud computing services (e.g., AWS, Google Cloud, Azure) and containerization technologies (e.g., Docker, Kubernetes) is a plus. Hugging Face ecosystem: demonstrated experience using Hugging …
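For a concrete taste of the Hugging Face ecosystem this posting references, here is a minimal, hedged sketch using the transformers pipeline API; the default checkpoint is chosen by the library and the example sentence is illustrative:

```python
# Sketch: sentiment analysis via the Hugging Face pipeline API.
# Downloads a default checkpoint on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Deploying this model was surprisingly painless."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```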