Paisley, Central Scotland, United Kingdom Hybrid / WFH Options
Venesky Brown
fine-tuning techniques including LoRA, QLoRA, and parameter-efficient methods - Multi-modal AI systems combining text, image, and structured data - Reinforcement Learning from Human Feedback (RLHF) for model alignment - Apache Airflow/Dagster for ML workflow orchestration and ETL pipeline management - Model versioning and experiment tracking (MLflow, Weights & Biases) - Real-time model serving and edge deployment strategies - A/…
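As an illustration of the first item, here is a minimal sketch of parameter-efficient fine-tuning with LoRA, assuming the Hugging Face transformers and peft libraries; the base model and hyperparameters are illustrative choices, not taken from the listing.

```python
# Minimal LoRA fine-tuning setup (illustrative model and hyperparameters).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # assumed base model

lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the LoRA updates
    target_modules=["c_attn"],  # GPT-2's attention projection; varies by architecture
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the small adapter matrices are trainable
```

QLoRA follows the same pattern, but the base model is loaded in 4-bit precision before the adapters are attached.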
City of London, London, United Kingdom Hybrid / WFH Options
Hunter Bond
in Python for data pipelines, transformation, and orchestration. Deep understanding of the Azure ecosystem (e.g., Data Factory, Blob Storage, Synapse). Proficiency in Databricks (or strong equivalent experience with Apache Spark). Proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit, contact me directly…
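For context, a minimal PySpark transformation of the kind this role describes might look like the sketch below; the dataset, column names and aggregation are hypothetical, and a tiny in-memory frame stands in for data that would normally be read from Azure storage or a Delta table.

```python
# Minimal PySpark pipeline sketch (hypothetical data and column names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# In a real pipeline this would be read from ADLS or a Delta table, e.g.
# spark.read.parquet("abfss://raw@<account>.dfs.core.windows.net/orders/").
raw = spark.createDataFrame(
    [("o1", "c1", "2024-01-05", 120.0), ("o2", "c1", "2024-01-05", 80.0)],
    ["order_id", "customer_id", "order_ts", "amount"],
)

daily_spend = (
    raw.dropDuplicates(["order_id"])                    # basic cleansing
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("customer_id", "order_date")
       .agg(F.sum("amount").alias("daily_spend"))
)

daily_spend.show()
```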
Springfield, Virginia, United States Hybrid / WFH Options
SecureVision
wide asynchronous messaging capability deployed across multiple security domains. The environment supports multiple tenants with a variety of different use cases. Specific Duties and Responsibilities: • O&M of existing Apache Pulsar services hosted on Red Hat OpenShift across multiple security domains in both cloud and datacenter (vSphere) environments. • Support deployment using Red Hat OpenShift, Keycloak, GitLab, GitLab CI, GitOps…
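To illustrate the messaging layer being operated (not the OpenShift deployment itself), here is a minimal sketch using the Pulsar Python client; the broker URL, topic and subscription names are hypothetical and assume a locally reachable broker.

```python
# Minimal Apache Pulsar produce/consume sketch (hypothetical broker and topic).
import pulsar

client = pulsar.Client("pulsar://localhost:6650")  # assumed broker address

producer = client.create_producer("persistent://public/default/events")
producer.send("hello from tenant-a".encode("utf-8"))

consumer = client.subscribe(
    "persistent://public/default/events", subscription_name="ops-check"
)
msg = consumer.receive(timeout_millis=5000)
print(msg.data())
consumer.acknowledge(msg)

client.close()
```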
City of London, London, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs • Set up monitoring and ensure data health for outputs • What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark • Previous experience with data analytics software • Ability to scope new integrations and translate user requirements into technical specifications • What’s in it for you? Base salary…
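As a sketch of the monitoring and data-health point, the snippet below checks null rates on a model output with PySpark; the in-memory stand-in table, column names and 5% threshold are all assumptions for illustration.

```python
# Minimal data-health check sketch (hypothetical data and threshold).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-health").getOrCreate()

# Stand-in for a real output table (e.g. spark.table("models.customer_scores")).
df = spark.createDataFrame(
    [("c1", 0.82), ("c2", None), ("c3", 0.44)],
    ["customer_id", "churn_score"],
)

row_count = df.count()
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
).first().asDict()

# Flag any column where more than 5% of values are missing.
bad_columns = {c: n for c, n in null_counts.items() if n and n / row_count > 0.05}
if bad_columns:
    print(f"Data health warning - null rate above 5% in: {bad_columns}")
```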
Ashburn, Virginia, United States Hybrid / WFH Options
Adaptive Solutions, LLC
Minimum of 3 years' experience building and deploying scalable, production-grade AI/ML pipelines in AWS and Databricks • Practical knowledge of tools such as MLflow, Delta Lake, and Apache Spark for pipeline development and model tracking • Experience architecting end-to-end ML solutions, including feature engineering, model training, deployment, and ongoing monitoring • Familiarity with data pipeline orchestration and…
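By way of illustration of the model-tracking piece, here is a minimal MLflow sketch; the experiment name, parameters and metric values are made up for the example.

```python
# Minimal MLflow experiment-tracking sketch (illustrative names and values).
import mlflow

mlflow.set_experiment("churn-model")  # hypothetical experiment name

with mlflow.start_run():
    mlflow.log_param("max_depth", 6)
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("val_auc", 0.91)
    # mlflow.log_artifact("feature_importance.png")  # if an artifact file exists
```

Run locally, this logs to an ./mlruns directory; on Databricks the tracking server is provided by the workspace.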
Manchester, North West, United Kingdom Hybrid / WFH Options
OpenSourced Ltd
Develop, maintain, and optimise web applications using Laravel and Vue.js • Build and customise WordPress themes and plugins, including WooCommerce integration • Manage cloud deployments (AWS/DigitalOcean) and server configuration (Apache/Nginx) • Maintain version control and CI/CD pipelines using GitHub workflows • Create responsive, interactive user interfaces with modern JavaScript frameworks • Design and maintain relational databases (MySQL/…
backend language • Front-end development utilising JavaScript, HTML and CSS, ideally with experience of a modern front-end JavaScript framework (ideally Vue.js) • Databases (MySQL, DynamoDB) and web servers (e.g., Apache, Nginx) • Code versioning tools, such as Git • Working in an Agile environment (we use Kanban) within cross-functional teams • Analysing, contributing to and refining requirements • Behaviour-Driven Development (BDD…
to handle multiple assignments concurrently. Other skills/experience that would be helpful to have: Bachelor's degree in Computer Science, Computer Engineering, or related field • Experience with BigFix for Apache • Experience with Nutanix HCI • Experience using ServiceNow ITIL-managed processes. Travel/Shift Work: This position requires the ability to travel 20% of the time. Also, potential to have…
identify issues using tools like PRTG or Zabbix. Ensure adherence to security, governance, and compliance policies. What we're looking for: 3+ years supporting client-server and web applications (Apache/IIS) in Windows Server environments. Strong SQL Server or Oracle database skills; scripting experience (SQL, PowerShell, Python, VBS). Knowledge of virtualized environments, data centres, and IT infrastructure…
and maintaining the Manhattan Platform. The platform runs in the cloud (AWS) with an RDS database. Experience with installations of SDNs and upgrading components. Willing to learn new techniques like Apache NiFi or other specific technologies. Interested in hardware like printers & scanners. Knowledge of Docker Swarm, Terraform and Atlantis, Jenkins, Git, and experience creating Bash scripts. Experience working in Agile…
in Israel. Willingness and ability to travel abroad. Bonus Points: Knowledge of and hands-on experience with Office 365 - a big advantage. Experience with Kafka, and preferably some exposure to Apache Flink, is a plus. Why Join Semperis? You'll be part of a global team on the front lines of cybersecurity innovation. At Semperis, we celebrate curiosity, integrity, and…
Shawnee Mission, Kansas, United States Hybrid / WFH Options
ECCO Select
mainly remote) Duration: Direct Hire Benefits: Medical/Dental/Vision/401k/PTO/Holidays Job Description: • Design, build, and maintain scalable data pipelines using tools like Apache NiFi, Airflow, or equivalent orchestration systems. • Work with structured and semi-structured data using SQL and NoSQL systems (e.g., PostgreSQL, MongoDB, Elasticsearch, Neo4j). • Develop services and integrations using … data pipeline or ETL contexts; Python is a plus. • Proficiency with SQL and NoSQL databases, including query optimization and large dataset processing. • Familiarity with data integration tools such as Apache NiFi, Airflow, or comparable platforms. • Knowledge of RESTful API interactions, JSON parsing, and schema transformations. • Exposure to cloud environments (especially AWS: S3, EC2, Lambda) and distributed systems. • Comfortable with…
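For the orchestration point, a minimal Apache Airflow DAG sketch is shown below; the DAG id, schedule and task bodies are placeholders rather than anything from this role (the `schedule` argument assumes Airflow 2.4+).

```python
# Minimal Airflow DAG sketch (hypothetical DAG id, schedule and task logic).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull records from the source system")


def load():
    print("write transformed records to the target store")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```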
Washington, Washington DC, United States Hybrid / WFH Options
Gridiron IT Solutions
properties, traversals, and relationship patterns essential for effective graph database design • Graph Query Language Proficiency: Demonstrated expertise in one or more graph query languages such as Cypher (Neo4j), Gremlin (Apache TinkerPop), SPARQL, or GraphQL for complex data retrieval and manipulation • Advanced Graph Data Modeling: Experience translating business requirements into optimized graph schemas, including property graphs, RDF triples, or knowledge…
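As a small example of the query-language point, the sketch below runs a Cypher query through the official Neo4j Python driver; the connection details, node labels and properties are hypothetical and assume a locally running Neo4j instance.

```python
# Minimal Cypher-over-Python sketch (hypothetical graph schema and credentials).
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

query = """
MATCH (p:Person)-[:WORKS_FOR]->(o:Organization {name: $org})
RETURN p.name AS person, o.name AS organization
"""

with driver.session() as session:
    for record in session.run(query, org="Example Corp"):
        print(record["person"], "->", record["organization"])

driver.close()
```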
platform components. • Big Data Architecture: Build and maintain big data architectures and data pipelines to efficiently process large volumes of geospatial and sensor data. Leverage technologies such as Hadoop, Apache Spark, and Kafka to ensure scalability, fault tolerance, and speed. • Geospatial Data Integration: Develop systems that integrate geospatial data from a variety of sources (e.g., satellite imagery, remote sensing … or related field. Experience with data visualization tools and libraries (e.g., Tableau, D3.js, Mapbox, Leaflet) for displaying geospatial insights and analytics. Familiarity with real-time stream processing frameworks (e.g., Apache Flink, Kafka Streams). Experience with geospatial data processing libraries (e.g., GDAL, Shapely, Fiona). Background in defense, national security, or environmental monitoring applications is a plus. Compensation and…
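To illustrate the Spark-plus-Kafka pattern mentioned here, a minimal Structured Streaming sketch follows; the broker address, topic name and console sink are assumptions, and the Spark session must be launched with the spark-sql-kafka connector package available.

```python
# Minimal Spark Structured Streaming read from Kafka (hypothetical broker/topic).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sensor-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "sensor-events")                 # assumed topic
    .load()
)

# Kafka delivers key/value as bytes; cast the value to string before parsing.
decoded = stream.select(F.col("value").cast("string").alias("payload"))

query = decoded.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```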