people, Boeing Defence UK provides long-term support for more than 120 Boeing military rotary-wing and fixed-wing aircraft in the UK, for example the Chinook and Apache helicopters and the Poseidon and C-17 aeroplanes. Our support ranges from mission-critical Logistics Information Services and next-generation in-flight digital tools to aircraft and operational modelling and simulation.
London - up to £60K + Benefits Key Skills: Ubuntu Linux, Bash scripting, Chef/Puppet, Jenkins, networking protocols (TCP/IP, HTTP, SSL), shell/Perl scripting, web servers (e.g. Apache), Agile methodologies, Docker. A well-known B2B SaaS business is now hiring a Cloud Engineer due to steady growth. They are looking for a technical and client-oriented Cloud
can be outside of the ones mentioned above. Agile, Scrum and DevOps knowledge are assets. CISSP and/or OSCP are assets. Others: PKI knowledge; reverse proxies (Apache HTTPD, NGINX) are assets; basic networking knowledge (Layers 3 and 4). Linux/Unix System Engineer (RedHat). Language: English. Soft skills: great teammate and supporter; customer focus; open-minded; eager to learn
Database - PostgreSQL or any SQL (experience with PostgreSQL or a similar Relational Database Management System (RDBMS)). Web app server - familiarity with (experience preferred in) an open-source web application server such as Apache HTTP Server or Tomcat. Middle tier - messaging systems: experience using Kafka (or any Java-based message broker) and ActiveMQ (the Java-based message broker we use)
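The middle-tier pattern this listing describes (producers publishing to a broker topic, consumers draining it) can be sketched with Python's standard-library queue as a stand-in for Kafka or ActiveMQ; the topic name and payloads here are invented for illustration, and a real deployment would use a broker client library instead:

```python
import queue

# In-memory stand-in for a single broker topic; a real system would
# connect to Kafka or ActiveMQ rather than hold messages in-process.
topic = queue.Queue()

def produce(message: dict) -> None:
    """Publish one message to the topic."""
    topic.put(message)

def consume_all() -> list:
    """Drain every pending message from the topic, in publish order."""
    messages = []
    while not topic.empty():
        messages.append(topic.get())
    return messages

produce({"order_id": 1, "status": "new"})
produce({"order_id": 2, "status": "shipped"})
print(consume_all())
# -> [{'order_id': 1, 'status': 'new'}, {'order_id': 2, 'status': 'shipped'}]
```

The broker's value in the real architecture is decoupling: producers and consumers never call each other directly, which is what the in-memory queue mimics here.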
as VMware, Docker, Hyper-V, Xen, Kubernetes, etc. Expertise in building out complex enterprise infrastructures using various operating systems and services (e.g. AD, Exchange, DNS, DHCP, VPN, email, databases, IIS, Apache, etc.)
and serverless solution architecting and development in Azure. • Experience in automation and serverless solution architecting and development in Oracle Cloud. • Experience with streaming data tools and software, such as Apache or Confluent Kafka. • Experience with Data Integration, Data Engineering and Data Lake implementations using ETL, Big Data and Cloud Technology. • Experience with JIRA and Confluence. • Familiarity with Security Information and
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Apache iX
a year Individual healthcare cover Genuine flexible working Work from home, our Bristol offices, or client sites The latest secure tech Investment in personal development Vibrant social scene Why Apache iX? Our growing team brings a wealth of experience from across the defence and security sector, and we pride ourselves on delivering the highest-quality services to our clients.
Markup languages such as JSON, CSS, and HTML. 24. Demonstrated experience with RESTful interfaces. 25. Demonstrated experience developing and maintaining redundant applications hosted in Linux or Unix using an Apache server. 1. Cyber Security Support: 2. Demonstrated experience with the Sponsor's IT review boards. 3. Demonstrated experience with providing recommendations to IT architecture and design reviews. 4. Demonstrated … with the Sponsor's authentication and authorization process. 12. Demonstrated experience delivering solutions with cloud services such as AWS, Oracle, Google, Microsoft Azure, or IBM. 13. Demonstrated experience with Apache Tomcat. 14. Demonstrated experience with continuous integration systems (Jenkins). 15. Demonstrated experience upgrading and refining existing production applications (working with legacy code). 16. Demonstrated experience with JIRA.
DoD CIO, DIA, GISA, NEC Experience with unified communications (VoIP Phones, DVTC, VTC) installation, configuration, and management Familiarity with database management systems (MySQL, PostgreSQL, SQL Server) and web servers (Apache, Nginx, IIS) Ability to assist in planning and executing IT projects, such as system upgrades, migrations, or new technology deployments Ability to communicate and work collaboratively with team members
Multi-Factor Authentication, Single-Sign On, Password Management, and Passwordless Authentication (FIDO2) solutions. Exposure to supporting Web Access Management solutions, such as Ping Access or CA SiteMinder. Experience with Apache and IIS solutions. Understanding of the OSI model. Knowledge of the Software Development Life Cycle. Familiarity and understanding of high-availability environments. Skills: Analytical Thinking Automation Collaboration Production Support
source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order, such as Apache Hadoop or Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - research and provide solutions to meet future growth or to eliminate occurring … source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order, such as Apache Hadoop or Python, and advanced knowledge of machine learning. Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using ETL and ELT technologies such … as Apache Kafka and Apache NiFi Experience in data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with cloud technologies and cloud computing platforms Experience with security and compliance Experience working in an Agile environment Qualifications: Must have an Active Secret clearance or higher
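The ETL pipelines this listing asks for follow a common extract/transform/load shape, which can be sketched in plain Python; the field names and rules below are invented for the example, and a production pipeline would read from and write to systems such as Kafka or NiFi rather than in-memory lists:

```python
def extract(raw_records):
    """Extract: pull raw rows from a source (here, an in-memory list)."""
    return list(raw_records)

def transform(records):
    """Transform: drop rows missing an id and normalise name casing."""
    cleaned = []
    for row in records:
        if row.get("id") is None:
            continue  # discard malformed rows rather than load them
        cleaned.append({"id": row["id"],
                        "name": row.get("name", "").strip().title()})
    return cleaned

def load(records, target):
    """Load: upsert rows into the target store keyed by id."""
    for row in records:
        target[row["id"]] = row
    return target

source = [{"id": 1, "name": " alice "}, {"name": "no id"}, {"id": 2, "name": "BOB"}]
warehouse = load(transform(extract(source)), {})
print(warehouse)
# -> {1: {'id': 1, 'name': 'Alice'}, 2: {'id': 2, 'name': 'Bob'}}
```

The same three stages map onto Kafka topics (extract), NiFi processors (transform), and a warehouse table (load) in the tooling the listing names.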
DV (MOD) Cleared Data Engineer - Elastic Stack & Apache NiFi Location: Bristol | Contract Type: £430.00 pd (Outside IR35) | Working Pattern: Hybrid (3-4 days on-site) Are you a contract Data Engineer with a knack for designing secure, high-performance data solutions? We're on the lookout for a technical expert in the Elastic Stack and Apache NiFi to … impact - ideal for professionals with a strong track record in regulated sectors. What You'll Be Doing Designing and deploying scalable, secure data pipelines using Elasticsearch, Logstash, Kibana, and Apache NiFi Handling real-time data ingestion and transformation with an emphasis on integrity and availability Collaborating with architects and cybersecurity stakeholders to align with governance and compliance needs Monitoring … Minimum 3 years' experience as a Data Engineer in sensitive or regulated industries Proficiency in the full Elastic Stack for data processing, analytics, and visualisation Hands-on expertise with Apache NiFi in designing sophisticated data workflows Solid scripting capabilities using Python, Bash, or similar Familiarity with best practices in data protection (encryption, anonymisation, access control) Experience managing large-scale
significantly enhance existing software Build modular, reusable services and features within a modern service-oriented architecture Work independently with minimal supervision in an Agile team environment Deploy and maintain Apache NiFi clusters Develop ETL (Extract, Transform, Load) processes for mission-critical data systems Create and maintain Ansible playbooks and roles for software-driven deployment of NiFi and Zookeeper clusters … for infrastructure needs Develop dashboard visualizations in Kibana based on data from Elasticsearch Integrate and interact with RESTful services via REST APIs Requirements: Active Full-Scope Polygraph Expertise with Apache NiFi and equivalent IC technologies General understanding and ability to work with Zookeeper Expert-level experience developing ETL processes 5+ years of experience with scripting languages such as Bash
Dunn Loring, Virginia, United States Hybrid / WFH Options
River Hawk Consulting LLC
modeling and documenting complex data/metadata structures, data flows, and models Experience creating visualizations with Tableau or comparable programs Demonstrated experience writing and modifying SQL Demonstrated experience with Apache Hive, Apache Spark, and HDFS or S3 Demonstrated expertise developing software using Neo4j, Python, or Java Knowledge of development tools such as Git, Jenkins, or Jira Experience/
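The "writing and modifying SQL" skill above can be illustrated with Python's standard-library sqlite3 module standing in for a warehouse engine such as Hive or Spark SQL; the table and column names are invented for the demo:

```python
import sqlite3

# In-memory database standing in for a warehouse table queried via Hive/Spark SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flows (src TEXT, dst TEXT, bytes INTEGER)")
conn.executemany("INSERT INTO flows VALUES (?, ?, ?)",
                 [("a", "b", 100), ("a", "c", 50), ("b", "c", 200)])

# Aggregate traffic per source - a typical analytic GROUP BY query.
rows = conn.execute(
    "SELECT src, SUM(bytes) FROM flows GROUP BY src ORDER BY src"
).fetchall()
print(rows)  # -> [('a', 150), ('b', 200)]
```

The same GROUP BY/SUM query text would run largely unchanged against Hive or Spark SQL; only the connection layer differs.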
the desired system and/or are the technical lead for the execution. The system probably contains a combination of: Data integration: integrate a variety of data sources with, for example, Apache Camel, Apache Pulsar or Kafka, dlt, Python, Airbyte Analytics Engineering: model data warehouses, both batch and real-time, with for example ClickHouse and dbt or SQLMesh Business intelligence: build … on k8s Translate business logic to available data, for example creating insights for a wholesale client with data warehousing using an Azure, AWS, GCP or on-premise architecture including Apache Kafka/Pulsar, SQLMesh/dbt, ClickHouse/Databend and Metabase/Superset. Build state-of-the-art systems that solve client-specific challenges, for example building agentic LLM
timely releases. Proven experience managing Power BI deployments (including workspaces, datasets, and reports). Strong understanding of data pipeline deployment using tools like Azure Data Factory, AWS Glue, or Apache Airflow. Hands-on experience with CI/CD tools (Azure DevOps, GitHub Actions, Jenkins).
looking for someone who can demonstrate an aptitude or willingness to learn some or all of the following technologies: AWS (S3, IAM, RDS, EMR, EC2, etc.), Linux commands, Trino, Apache Spark, Node.js, JavaScript, Preact.js, Postgres, MySQL, HTML, CSS. Target salary range is $125k-$150k or more depending on experience. We recognize this skillset is in high demand and will
end tech specs and modular architectures for ML frameworks in complex problem spaces in collaboration with product teams Experience with large scale, distributed data processing frameworks/tools like Apache Beam, Apache Spark, and cloud platforms like GCP or AWS Where You'll Be We offer you the flexibility to work where you work best! For this role
/or teaching technical concepts to non-technical and technical audiences alike Passion for collaboration, life-long learning, and driving business value through ML Preferred Experience working with Databricks & Apache Spark to process large-scale distributed datasets About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over … Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet
solutions. Required Skills and Experience: Experience in Python Experience with either Angular, React, Django and/or Flask Optional Skills: Full-stack development Database expertise NiFi experience Proxy experience (Apache, Nginx, …) Amazon Web Services DevOps experience Need 6 years of experience with a Bachelor's degree in a related field or 4 years of experience with a Master's
and maintain data pipelines using Python. Develop real-time streaming features using big data tools such as Spark. SKILLS AND EXPERIENCE Extensive experience using big data tools such as Apache Spark. Experience working in and maintaining a GCP database. Strong Python coding background. Good knowledge of working with SQL. THE BENEFITS Generous holiday plan. Career development plan. Flexible working.
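The real-time streaming features mentioned here would normally be built with Spark Structured Streaming; the core idea of a tumbling-window aggregation can be sketched in plain Python (the event names, timestamps, and window size are illustrative, not from the listing):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key - conceptually the same aggregation
    Spark performs with groupBy(window(...)) over a stream."""
    counts = defaultdict(int)
    for ts, key in events:
        # Each event belongs to exactly one window, found by flooring
        # its timestamp to the nearest window boundary.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (5, "click"), (12, "view"), (13, "click")]
print(tumbling_window_counts(events, 10))
# -> {(0, 'click'): 2, (10, 'view'): 1, (10, 'click'): 1}
```

Spark adds distribution, fault tolerance, and late-data handling on top of this basic windowing logic, which is why it is preferred at scale.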
in automating data quality checks, reducing data errors by 40% and ensuring more reliable reporting and analytics with data marts. Expertise in data orchestration and automation tools such as Apache Airflow, Python, and PySpark, supporting end-to-end ETL workflows. Experience in deployment activities.
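Automated data quality checks of the kind credited above usually boil down to a few reusable rules run over each batch; this sketch uses invented column names and thresholds, and in a real deployment such checks would run inside an Airflow task or PySpark job:

```python
def check_not_null(rows, column):
    """Return the rows where the given column is missing or null."""
    return [r for r in rows if r.get(column) is None]

def check_in_range(rows, column, low, high):
    """Return the rows whose numeric value falls outside [low, high]."""
    return [r for r in rows
            if r.get(column) is not None and not low <= r[column] <= high]

rows = [{"price": 10.0}, {"price": None}, {"price": -3.0}]
null_failures = check_not_null(rows, "price")
range_failures = check_in_range(rows, "price", 0, 100)
print(len(null_failures), len(range_failures))  # -> 1 1
```

Failing rows can then be quarantined or the pipeline run failed fast, which is how automated checks cut errors before they reach reporting data marts.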
Ashburn, Virginia, United States Hybrid / WFH Options
Adaptive Solutions, LLC
Minimum of 3 years' experience building and deploying scalable, production-grade AI/ML pipelines in AWS and Databricks • Practical knowledge of tools such as MLflow, Delta Lake, and Apache Spark for pipeline development and model tracking • Experience architecting end-to-end ML solutions, including feature engineering, model training, deployment, and ongoing monitoring • Familiarity with data pipeline orchestration and