pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
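As a rough illustration of the Airflow pipeline work described above, here is a minimal DAG sketch, assuming Airflow 2.4 or later; the task names and payload are hypothetical and not taken from the listing.

```python
# Minimal Airflow DAG sketch: a daily pipeline with one extract and one load task.
# Names (extract_orders, load_orders) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder extract step; a real task would pull from an upstream source.
    return [{"order_id": 1, "amount": 42.0}]


def load_orders(**context):
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # use schedule_interval on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load
```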
environments, and better development practices. Excellent written and verbal communication skills. Experience with DevOps frameworks. Entity Framework or similar ORM. Continuous Integration, Configuration Management. Enterprise Service Bus (ESB) Management (Apache ActiveMQ or NiFi). Technical Writing. Past Intelligence Systems experience. Experience with Test Driven Development. Some system administration experience. Experience with Jira, Confluence. U.S. Citizen. Must be able to …
technologies. Experience with CI/CD pipelines and integrating automated tests within them - Jenkins, Bitbucket required. Familiarity with performance testing, security testing, and other non-functional testing approaches - JMeter, ApacheBench preferred. Good experience working with cloud technologies and services on AWS. Strong practical experience in Flyway or Liquibase. Strong understanding of modern technologies and adoption of advanced …
environments, and better development practices • Excellent written and verbal communication skills • Experience with DevOps frameworks • Entity Framework or similar ORM. • Continuous Integration, Configuration Management. • Enterprise Service Bus (ESB) Management (Apache ActiveMQ or NiFi) • Technical Writing • Past Intelligence Systems experience. • Experience with Test Driven Development • Some system administration experience • Experience with Jira, Confluence Desired Qualification: • AWS, biometric, Microservices, User …
data engineering or a related field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're proud of that. And it's …
are some things Naimuri have worked on recently that might give you a better sense of what you'll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking. Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data into a common data format …
Azure) Experience managing PKI/X.509 certificate infrastructure. Extensive experience supporting and implementing TLS/SSL certificate management systems. Proficient with token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy. Solid knowledge of Linux security and system operations. Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support …
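As a small illustration of the TLS/SSL certificate management work mentioned above, the sketch below checks how many days remain before a server's certificate expires, using only the Python standard library; the host name is a placeholder, not something from the listing.

```python
# Minimal sketch: report days until a server's TLS certificate expires,
# the kind of routine check X.509/TLS certificate management involves.
import socket
import ssl
from datetime import datetime, timezone


def cert_days_remaining(host: str, port: int = 443) -> int:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' is a string like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).days


if __name__ == "__main__":
    print(cert_days_remaining("example.com"))  # placeholder host
```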
substituted for a degree) • 15+ years of relevant experience in software development, ranging from work in a DevOps environment to full stack engineering • Proficiency in the following technologies: • Java • Apache NiFi workflow configuration and deployment • Databases such as PostgreSQL and MongoDB • Python and machine learning • Docker • Kubernetes • Cloud-like infrastructure • Experience with Jenkins for pipeline integration and deployment • Familiarity …
languages such as Python, Java, or C++. Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, Scikit-learn). Familiarity with data processing tools and platforms (e.g., SQL, Apache Spark, Hadoop). Knowledge of cloud computing services (e.g., AWS, Google Cloud, Azure) and containerization technologies (e.g., Docker, Kubernetes) is a plus. Hugging Face Ecosystem: Demonstrated experience using Hugging …
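For a flavour of the scikit-learn side of the toolset listed above, here is a minimal, self-contained model-fitting sketch; the dataset and model choice are illustrative assumptions rather than anything specified by the role.

```python
# Minimal scikit-learn sketch: fit a classifier on a bundled dataset and
# report held-out accuracy. Dataset and model are illustrative only.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```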
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
attitude, capable of acquiring new skills. Objective and logical with an enquiring and creative mind. It would be nice if you had: Data Engineering - experience of one or more: Apache ecosystem, SQL, Python. Web - HTML, CSS, JavaScript, XML, SOAP. Experience with Secure DevSecOps within an Agile/SAFe environment. Containerisation & Orchestration - Docker, Podman, Kubernetes, Rancher etc. Software development capability. …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
stakeholders. One or more of the following certifications are desired: Certified Cloud Security Professional (CCSP), GIAC Security Essentials Certification (GSEC), or CompTIA Cybersecurity Analyst (CySA+). Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
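Since several of these roles cite familiarity with Apache Spark for large-scale transformations, here is a minimal PySpark sketch, assuming a local Spark installation; the table and column names are invented for illustration.

```python
# Minimal PySpark sketch: aggregate a small DataFrame the way a larger-scale
# transformation job would. Data and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transform-sketch").getOrCreate()

orders = spark.createDataFrame(
    [
        ("2024-01-01", "books", 12.5),
        ("2024-01-01", "games", 30.0),
        ("2024-01-02", "books", 7.0),
    ],
    ["order_date", "category", "amount"],
)

daily_totals = (
    orders.groupBy("order_date", "category")
    .agg(F.sum("amount").alias("total_amount"))
    .orderBy("order_date")
)
daily_totals.show()
spark.stop()
```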
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Agile/Scrum, Python Programmer Preferred Qualifications: DOD 8570 IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
teams while maintaining a continuous improvement mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Solutions Architect or Machine Learning Specialty. Databricks Certified Machine Learning Professional. Agile/Scrum Master Certification. Specialized certifications in AI/ML tools or methodologies. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
AIBODY is a pioneer in Digital Physiology, offering the world’s most comprehensive library of digital human models, from cellular mechanisms to full-body simulations. Our innovations empower next-generation medical professionals through interactive and intuitive tools that simplify the …
associated with Trading such as MiFID II or EMIR. Experience building real-time applications based on a messaging paradigm. Experience building large-scale data processing pipelines (e.g. using Apache Spark). Experience building highly available, high-performance applications. Experience of the FIX messaging protocol. FX Options/Derivatives experience. Senior Python Developer …
protocols, DNS, SSL/TLS, and web security practices. Your day-to-day will include, but is not limited to: Installation and Configuration: Install and configure web server software (e.g., Apache, Nginx, IIS) and related components. Maintenance and Upgrades: Regularly update and patch web servers to ensure they are secure and performing optimally. Monitoring and Performance Tuning: Monitor server performance …
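As a small illustration of the monitoring side of that day-to-day, the sketch below probes a web server endpoint and reports status and latency using only the Python standard library; the URL is a placeholder, not one named by the listing.

```python
# Minimal sketch of an HTTP health/latency probe for web server monitoring.
import time
import urllib.request


def probe(url: str, timeout: float = 5.0) -> tuple[int, float]:
    # Returns (HTTP status code, elapsed seconds) for a single request.
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
        return resp.status, time.monotonic() - start


if __name__ == "__main__":
    status, elapsed = probe("https://example.com/health")  # placeholder URL
    print(f"status={status} latency={elapsed:.3f}s")
```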
Spring Boot ecosystem. In-depth knowledge of JPA, particularly with Hibernate. Good understanding of RESTful web services and API design. Expertise in MySQL or any other RDBMS. Expertise in Apache Kafka and change data capture pipelines. Proficient in Java performance profiling. Experience with search solutions like Hibernate Search & Elasticsearch. Strong ability to present technical information clearly. Excellent at providing … systems and build automations. Experience with service-oriented architecture and multi-tier server applications. Exposure to Caching Systems, Spring Cloud, Swagger. Experience with data warehousing solutions like Snowflake/Apache Doris. Familiarity with Debezium. HOW TO APPLY Please apply with a CV and cover letter demonstrating how you meet the skills above. If we would like to move forward …
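The role above centres on Java and Spring Boot, but as a compact illustration of consuming Kafka change-data-capture events (the Debezium-style pipelines it mentions), here is a Python sketch using the kafka-python client; the topic name, broker address, and event shape are assumptions for the example.

```python
# Minimal sketch: consume Debezium-style change-data-capture events from Kafka.
# Topic, broker, and payload layout are illustrative assumptions.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "dbserver1.inventory.customers",   # typical Debezium topic naming, assumed
    bootstrap_servers="localhost:9092",
    group_id="cdc-sketch",
    value_deserializer=lambda raw: json.loads(raw) if raw else None,
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    if event is None:                  # tombstone record, nothing to process
        continue
    payload = event.get("payload", {})
    print(payload.get("op"), payload.get("after"))
```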
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Effective communication skills and a collaborative mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g. …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hybrid working when possible Must hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … maintain optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full-time on-site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), and ETL processing languages such as Groovy, Python or Java. To be Considered: Please either apply by clicking online or … hearing from you. KEY SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE …
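As a minimal illustration of the Python ETL work this listing describes, the sketch below transforms a couple of raw records into a common format and loads them into MongoDB with pymongo; the connection string, database, and field names are placeholders.

```python
# Minimal ETL sketch: normalise raw records and load them into MongoDB.
# Connection string, database, collection, and fields are placeholders.
from pymongo import MongoClient

raw_records = [
    {"id": "1", "name": " Alice ", "score": "10"},
    {"id": "2", "name": "Bob", "score": "7"},
]


def transform(record: dict) -> dict:
    # Trim whitespace and coerce types into a common target format.
    return {
        "_id": record["id"],
        "name": record["name"].strip(),
        "score": int(record["score"]),
    }


client = MongoClient("mongodb://localhost:27017")
collection = client["etl_sketch"]["people"]
collection.delete_many({})  # keep the sketch idempotent across re-runs
collection.insert_many([transform(r) for r in raw_records])
print(collection.count_documents({}))
```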
and tests. • Leverage development and design patterns to ensure the product's scalability, maintainability, and long-term success. • Understand API-driven microservice design patterns, NoSQL databases, data ingest tools (Apache NiFi), and modern web frameworks. • Maintain a team player mentality as a collaborative member of a fast-paced, structured 10-14 person team. Skills Requirements: • Proficiency in the following … a 10+ person team environment. Nice to Haves: • NoSQL DBs (Mongo, Elasticsearch, Redis, Graph DB, etc.). • Data wrangling (Discovery, Mining, Cleaning, Exploration, Modeling, Structuring, Enriching, and Validating) with Apache NiFi or similar tools. • CI/CD (e.g., Jenkins), JUnit testing or similar. • Scripting with Bash, Python, and/or Groovy. YOE Requirement: 3 yrs., B.S. in a technical …
required GCP Focus What will make you stand out: Proven experience with data warehousing and ETL/ELT. Google BigQuery, GCP, SQL, etc. Experience with Python, TypeScript, etc. Experience with Apache Airflow, Cloud Composer, etc. Experience with stream processing (e.g. Dataflow, Pub/Sub). Experience with data visualization tools (e.g. Looker Studio, Tableau). This role is an immediate …
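To give a flavour of the BigQuery side of this GCP-focused role, here is a minimal query sketch using the google-cloud-bigquery client, assuming credentials are already configured in the environment; the project, dataset, and table names are placeholders.

```python
# Minimal sketch: run an aggregation query against BigQuery and print the rows.
# Project, dataset, and table are placeholders; requires configured GCP credentials.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my-project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(query).result():
    print(row["order_date"], row["total_amount"])
```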
platform teams at scale, ideally in consumer-facing or marketplace environments. Strong knowledge of distributed systems and modern data ecosystems, with hands-on experience using technologies such as Databricks, Apache Spark, Apache Kafka, and dbt. Proven success in building and managing data platforms supporting both batch and real-time processing architectures. Deep understanding of data warehousing, ETL/ …