Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
etc. Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable: JupyterHub awareness RabbitMQ or other common queue technology More ❯
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
and running hybrids. Container orchestration deployment, configuration, and administration. (e.g. Docker Swarm, Kubernetes). Automation and configuration management (e.g., Ansible, Puppet, Chef). Analytics infrastructure and tooling (e.g. Databricks, Apache Spark). Technical leadership, championing standard methodologies, and leading technical product transformations. In Return: We are proud to have a set of behaviours that reflect our culture and guide More ❯
Spoken Fluency in English AND French or Dutch Seniority: 3-7 years of work experience Programming Skills : Expertise in Python and SQL, with experience in big data tools like Apache Spark or Databricks. Machine Learning Mastery : Proficiency in TensorFlow, PyTorch, and scikit-learn for designing, training, and deploying ML models. Cloud Expertise : Hands-on experience with Azure Machine Learning More ❯
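The data-science stack listed above ultimately comes down to fitting model parameters to data. As a minimal, framework-free sketch of that training loop (the toy data, learning rate, and epoch count are illustrative only - TensorFlow, PyTorch, and scikit-learn replace all of this with optimised, GPU-aware primitives):

```python
def train_linear(xs, ys, lr=0.01, epochs=2000):
    """Fit y = w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data generated from y = 3x + 1.
xs = [0, 1, 2, 3, 4]
ys = [1, 4, 7, 10, 13]
w, b = train_linear(xs, ys)
print(round(w, 2), round(b, 2))  # -> 3.0 1.0
```

The same design-train-evaluate cycle scales out when the data no longer fits one machine, which is where Spark or Databricks enter the picture.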
systems to support real-time decision-making in the FX market. Building such systems is highly challenging and provides opportunities to work with modern technologies like NoSQL databases, Kafka, Apache Flink, and more. It offers significant opportunities for growth, leadership, and innovation, as well as direct interaction with clients and business teams to deliver impactful solutions in the FX More ❯
people, Boeing Defence UK provides long-term support for more than 120 Boeing military rotary-wing and fixed-wing aircraft in the UK. For example, the Chinook and Apache helicopters, and the Poseidon and C-17 airplanes. Our support ranges from mission critical Logistics Information Services, next generation in-flight digital tools, to aircraft and operational modelling and simulation More ❯
BSM is responsible for the design, development, manufacture, and maintenance of training devices for a wide variety of commercial and military aircraft - everything from F-15 fighter jets to Apache attack helicopters, and even NASA's Starliner spacecraft. Software engineers on our team are responsible for all phases of the software lifecycle, including architecture, design, implementation and test. They More ❯
London - up to £60K + Benefits Key Skills: Ubuntu Linux, Bash scripting, Chef/Puppet, Jenkins, Networking protocols - TCP/IP, HTTP, SSL, shell/Perl scripting, Web Servers e.g. Apache, etc, Agile methodologies, Docker. A well-known B2B SaaS business is now hiring a Cloud Engineer due to steady growth. They are looking for a technical and client-orientated Cloud More ❯
can be outside of the ones mentioned here above. Agile, Scrum, DevOps knowledge is an asset. CISSP and/or OSCP are assets. Others: PKI knowledge. Reverse Proxies: Apache HTTPD, NGINX are assets. Basic networking knowledge (Layer 3, 4). Linux/Unix System Engineer (RedHat). Language: English. Soft skills: Great teammate and supporter. Customer focus. Open-minded. Eager to learn More ❯
Database - PostgreSQL or any SQL (experience with PostgreSQL or a similar Relational Database Management System (RDBMS)). Web app server - familiar (experience preferred) with an open-source web application server such as Apache HTTP Server or Tomcat. Middle tier - messaging systems. Experience using the following: Kafka - or any Java-based message broker; ActiveMQ - the product we are using, a Java-based message broker More ❯
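The middle-tier requirement above centres on publish/subscribe messaging. A toy in-process sketch of that pattern, using only the Python standard library (the `Broker` class and topic names are illustrative inventions, not any Kafka or ActiveMQ API - those systems add the persistence, partitioning, and network transport this sketch deliberately omits):

```python
import queue
import threading

class Broker:
    """Toy in-process message broker: each topic fans out to its subscribers."""
    def __init__(self):
        self._topics = {}          # topic name -> list of subscriber queues
        self._lock = threading.Lock()

    def subscribe(self, topic):
        # Each subscriber gets a private thread-safe queue for the topic.
        q = queue.Queue()
        with self._lock:
            self._topics.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, message):
        # Deliver a copy of the message to every current subscriber.
        with self._lock:
            subscribers = list(self._topics.get(topic, []))
        for q in subscribers:
            q.put(message)

broker = Broker()
inbox = broker.subscribe("orders")
broker.publish("orders", {"id": 1, "status": "NEW"})
print(inbox.get(timeout=1))  # -> {'id': 1, 'status': 'NEW'}
```

Decoupling producers from consumers this way is what lets the middle tier absorb load spikes and restart services independently.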
as VMware, Docker, Hyper-V, Xen, Kubernetes, etc. Expertise in building out complex enterprise infrastructures using various operating systems and services (e.g., AD, Exchange, DNS, DHCP, VPN, email, databases, IIS, Apache, etc. More ❯
and serverless solution architecting development in Azure. • Experience in automation and serverless solution architecting development in Oracle Cloud. • Experience with streaming data tools and software, such as Apache or Confluent Kafka • Experience with Data Integration, Data Engineering and Data Lake implementations using ETL, Big Data and Cloud Technology. • Experience with JIRA/Confluence • Familiarity with Security Information and More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Apacheix
a year Individual healthcare cover Genuine flexible working Work from home, our Bristol offices, or client sites The latest secure tech Investment in personal development Vibrant social scene Why Apache iX? Our growing team brings a wealth of experience from across the defence and security sector, and we pride ourselves in delivering the highest quality services to our clients. More ❯
Markup languages such as JSON, CSS, and HTML. 24. Demonstrated experience with RESTful interfaces. 25. Demonstrated experience developing and maintaining redundant applications hosted in Linux or Unix using Apache server. 1. Cyber Security Support: 2. Demonstrated experience with the Sponsor's IT review boards. 3. Demonstrated experience with providing recommendations to IT architecture and design reviews. 4. Demonstrated … with the Sponsor's authentication and authorization process. 12. Demonstrated experience delivering solutions with cloud services such as AWS, Oracle, Google, Microsoft Azure, or IBM. 13. Demonstrated experience with Apache Tomcat. 14. Demonstrated experience with Continuous integration systems (Jenkins). 15. Demonstrated experience upgrading and refining existing production applications (working with legacy code). 16. Demonstrated experience with JIRA. More ❯
source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - research and provide solutions to meet future growth or to eliminate occurring … source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using ETL and ELT technologies such … as Apache Kafka and Apache NiFi Experienced in data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with cloud technologies and cloud computing platforms Experience with security and compliance Experience working in an Agile environment Qualifications: Must have an Active Secret clearance or higher More ❯
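The pipeline work described above follows the classic extract-transform-load shape. A minimal pure-Python sketch of that shape (the record fields and in-memory "warehouse" are illustrative assumptions - in a role like this, Apache Kafka or NiFi would carry the extract and load stages between real systems):

```python
def extract(source):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return iter(source)

def transform(records):
    """Transform: clean and reshape each raw record into the target schema."""
    for r in records:
        yield {
            "name": r["name"].strip().title(),        # normalise whitespace/case
            "amount": round(float(r["amount"]), 2),   # coerce strings to numbers
        }

def load(records, warehouse):
    """Load: append the transformed records to the target store."""
    warehouse.extend(records)
    return warehouse

raw = [{"name": "  ada lovelace ", "amount": "19.999"},
       {"name": "alan turing", "amount": "5"}]
warehouse = load(transform(extract(raw)), [])
print(warehouse[0])  # -> {'name': 'Ada Lovelace', 'amount': 20.0}
```

An ELT variant simply swaps the last two stages: load the raw records first, then run the transform inside the warehouse itself.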
significantly enhance existing software Build modular, reusable services and features within a modern service-oriented architecture Work independently with minimal supervision in an Agile team environment Deploy and maintain Apache NiFi clusters Develop ETL (Extract, Transform, Load) processes for mission-critical data systems Create and maintain Ansible playbooks and roles for software-driven deployment of NiFi and Zookeeper clusters … for infrastructure needs Develop dashboard visualizations in Kibana based on data from Elasticsearch Integrate and interact with RESTful services via REST APIs Requirements: Active Full-Scope Polygraph Expertise with Apache NiFi and equivalent IC technologies General understanding and ability to work with Zookeeper Expert-level experience developing ETL processes 5+ years of experience with scripting languages such as Bash More ❯
Dunn Loring, Virginia, United States Hybrid / WFH Options
River Hawk Consulting LLC
modeling and documenting complex data/metadata structures, data flows, and models Experience creating visualizations with Tableau or comparable programs Demonstrated experience writing and modifying SQL Demonstrated experience with Apache Hive, Apache Spark, and HDFS or S3 Demonstrated expertise developing software using Neo4j, Python, or Java Knowledge of development tools such as Git, Jenkins, or Jira Experience/ More ❯
the desired system and/or are technical lead for the execution. The system probably contains a combo of Data integration: integrate a variety of data sources with for example Apache Camel, Apache Pulsar or Kafka, dlt, Python, Airbyte Analytics Engineering: model data warehouses both batch and real-time with for example ClickHouse and dbt or SQLMesh Business intelligence: build … on k8s Translate business logic to available data, for example creating insights for a wholesale client with data warehousing using an Azure, AWS, GCP or on-premise architecture including Apache Kafka/Pulsar, SQLMesh/dbt, ClickHouse/Databend and Metabase/Superset. Build state-of-the-art systems that solve client-specific challenges, for example building agentic LLM More ❯
looking for someone who can demonstrate an aptitude or willingness to learn some or all of the following technologies. AWS - S3, IAM, RDS, EMR, EC2, etc Linux Commands Trino Apache Spark Node.js JavaScript Preact.js Postgres MySQL HTML CSS Target Salary Range is $125k-$150k or more depending on experience. We recognize this skillset is in high demand and will More ❯
end tech specs and modular architectures for ML frameworks in complex problem spaces in collaboration with product teams Experience with large scale, distributed data processing frameworks/tools like Apache Beam, Apache Spark, and cloud platforms like GCP or AWS Where You'll Be We offer you the flexibility to work where you work best! For this role More ❯
/or teaching technical concepts to non-technical and technical audiences alike Passion for collaboration, life-long learning, and driving business value through ML Preferred Experience working with Databricks & Apache Spark to process large-scale distributed datasets About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over … Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet More ❯
evolution of our technical stack through the implementation and adoption of new technologies. You will report to the leadership within the Persistence Infrastructure Team. Your Impact Provisioning and maintaining Apache Pulsar Infrastructure on Kubernetes for Event Driven Architecture Developing and deploying software and tools for managing the lifecycle of persistence services such as Kubernetes operators, configuration management tools, shell … hardening activities Developing automation to remove manual tasks Developing and maintaining observability dashboards and alerting Collaborating with Software Engineers and Users across the business Your Qualifications Operational experience with Apache Pulsar or Kafka Experience working with Kubernetes Experience in Linux system administration Familiarity with CI/CD pipeline tooling Comfortable with scripting for automation Preferred Skills Software development skills More ❯
solutions. Required Skills and Experience: Experience in Python Experience with either Angular, React, Django and/or Flask Optional Skills: Full stack development Database expertise NiFi experience Proxy experience (Apache, Nginx, ...) Amazon Web Services DevOps experience Need 6 years of experience with a Bachelor's degree in a related field or 4 years of experience with a Master's More ❯
Ashburn, Virginia, United States Hybrid / WFH Options
Adaptive Solutions, LLC
Minimum of 3 years' experience building and deploying scalable, production-grade AI/ML pipelines in AWS and Databricks • Practical knowledge of tools such as MLflow, Delta Lake, and Apache Spark for pipeline development and model tracking • Experience architecting end-to-end ML solutions, including feature engineering, model training, deployment, and ongoing monitoring • Familiarity with data pipeline orchestration and More ❯
Scala programming on the JVM Experience with concurrency, memory management and I/O Experience with Linux or other Unix-like systems Experience with distributed databases, DataStax Enterprise or Apache Cassandra in particular Experience with distributed computing platforms, Apache Spark in particular ABOUT BUSINESS UNIT IBM Software infuses core business operations with intelligence - from machine learning to generative More ❯