JUnit 5, Mockito, database integration. AI: LangChain, Retrieval-Augmented Generation (RAG), MCP servers (as consumer and developer), and prompt engineering for LLM optimization. Exposure to popular libraries and frameworks (Apache Commons, Guava, Swagger, TestContainers). Architecture & Platforms: Skilled in designing and deploying distributed systems on cloud hyperscalers (AWS, GCP). Familiarity with containerization (Docker), CI/CD pipelines, DevOps …
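The RAG skill named above follows one core loop: embed the documents, retrieve the ones most similar to the query, and prepend them to the prompt before calling an LLM. A minimal sketch, using a toy bag-of-words similarity in place of a real embedding model (all names here are illustrative):

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real RAG stack would use a neural
    # embedding model and a vector store instead.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Augment the prompt with retrieved context before the LLM call.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "JUnit 5 is a testing framework for Java.",
    "Apache Kafka is a distributed event streaming platform.",
]
prompt = build_prompt("What is Kafka?", docs)
```

The prompt-engineering part of the skill is the last step: how retrieved context is framed for the model materially affects answer quality.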
Deep Learning or LLM Frameworks) Desirable: • Minimum 2 years' experience in a data-related field • Minimum 2 years in a business or management consulting field • Experience with Docker, Hadoop, PySpark, Apache or MS Azure • Minimum 2 years' NHS/healthcare experience. Disclosure and Barring Service Check: This post is subject to the Rehabilitation of Offenders Act (Exceptions Order) 1975 and …
Slough, South East England, United Kingdom (Hybrid / WFH Options)
Hunter Bond
in Python for data pipelines, transformation, and orchestration. Deep understanding of the Azure ecosystem (e.g., Data Factory, Blob Storage, Synapse, etc.) Proficiency in Databricks (or strong equivalent experience with Apache Spark). Proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit - contact me directly …
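The pipeline shape this listing asks for can be sketched without assuming a Spark or Databricks installation: lazy transformation stages chained by a small runner, the same structure a Spark job or Data Factory mapping expresses at scale. All function and field names below are illustrative:

```python
def clean(rows):
    # Transformation stage: drop incomplete rows and normalise types,
    # much like a filter plus cast step in Spark.
    for row in rows:
        if row.get("amount") is not None:
            yield {**row, "amount": float(row["amount"])}

def enrich(rows):
    # Transformation stage: derive a new column (illustrative FX rate).
    for row in rows:
        yield {**row, "amount_gbp": row["amount"] * 0.79}

def run_pipeline(rows, stages):
    # Orchestration: chain the lazy stages, then materialise the result.
    for stage in stages:
        rows = stage(rows)
    return list(rows)

raw = [{"id": 1, "amount": "10"}, {"id": 2, "amount": None}]
out = run_pipeline(raw, [clean, enrich])
```

Because every stage is a generator, nothing runs until the final `list()`, mirroring the lazy-evaluation model of Spark transformations versus actions.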
Technical Expertise: Solid experience in Python programming, particularly using data manipulation and processing libraries such as Pandas, NumPy, and Apache Spark. Hands-on experience with open-source data frameworks like Apache Spark, Apache Kafka, and Apache Airflow. Strong proficiency in SQL, including advanced query development and performance tuning. Good understanding of distributed computing principles and big … automation pipelines. Experience working with relational databases such as PostgreSQL, MySQL, or equivalent platforms. Skilled in using containerization technologies including Docker and Kubernetes. Experience with workflow orchestration tools like Apache Airflow or Dagster. Familiar with streaming data pipelines and real-time analytics solutions.
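The workflow-orchestration skill mentioned here (Airflow, Dagster) reduces to one core idea: tasks form a directed acyclic graph, and the scheduler runs them in an order that respects every dependency. The DAG below is hypothetical, but the ordering logic is exactly what the standard library's `graphlib` provides:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical task graph in the spirit of an Airflow or Dagster DAG:
# each key maps a task to the set of upstream tasks it depends on.
dag = {
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

# A scheduler executes tasks in a dependency-respecting order;
# static_order() yields one such linearisation.
order = list(TopologicalSorter(dag).static_order())
```

In a real orchestrator, `transform` and `quality_check` could run in parallel once `extract` finishes; the topological order only guarantees `load` runs last.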
fine-tuning, or deployment. Technical Skills: Proficiency in Python and frameworks such as PyTorch, TensorFlow, or JAX. Strong familiarity with distributed computing and data engineering tools (e.g., SLURM, Apache Spark, Airflow). Hands-on experience with LLM training, fine-tuning, and deployment (e.g., Hugging Face, LLamafactory, NVIDIA NeMo). Preferred Qualifications: Advanced degree (MS/PhD) in Computer …
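Much of the fine-tuning work this listing describes starts with data preparation: turning instruction/answer pairs into JSONL records, one example per line. A minimal sketch of that step; the "messages" chat layout is widely used, but exact field names vary by framework, so treat this as an illustration rather than any library's canonical schema:

```python
import json

def to_chat_record(instruction: str, answer: str) -> str:
    # One supervised fine-tuning example serialised as a single JSON line.
    # Field names here are an assumption modelled on common chat formats.
    record = {
        "messages": [
            {"role": "user", "content": instruction},
            {"role": "assistant", "content": answer},
        ]
    }
    return json.dumps(record)

pairs = [("What is JAX?", "A Python library for array computing with autodiff.")]
jsonl = "\n".join(to_chat_record(q, a) for q, a in pairs)
```

The resulting JSONL file is what a fine-tuning job typically consumes, one training example per line.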
a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams. A strong command of Apache NiFi is essential for this role. You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation, and delivery. … business needs and compliance requirements.
• Maintain documentation of data flows and processes, ensuring knowledge sharing and operational transparency.
Skills & Experience: You will have the following skills or proven experience:
Apache NiFi Expertise:
• Deep understanding of core NiFi concepts: FlowFiles, Processors, Controller Services, Schedulers, Web UI.
• Experience designing and optimizing data flows for batch, real-time streaming, and event-driven …
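The NiFi concepts the listing names can be modelled compactly: a FlowFile is content plus attributes, and processors transform or route it. The sketch below is a plain-Python analogue of NiFi's UpdateAttribute and RouteOnAttribute processors, purely to illustrate the mental model; it is not NiFi's API:

```python
from dataclasses import dataclass, field

@dataclass
class FlowFile:
    # NiFi's unit of work: a payload plus key/value attributes that
    # processors read and write as the file moves through the flow.
    content: bytes
    attributes: dict = field(default_factory=dict)

def update_attribute(ff: FlowFile, key: str, value: str) -> FlowFile:
    # Analogue of the UpdateAttribute processor: set an attribute.
    ff.attributes[key] = value
    return ff

def route_on_attribute(ff: FlowFile, key: str, expected: str) -> str:
    # Analogue of RouteOnAttribute: return the relationship name that
    # a downstream connection would be wired to.
    return "matched" if ff.attributes.get(key) == expected else "unmatched"

ff = FlowFile(b'{"id": 1}')
ff = update_attribute(ff, "mime.type", "application/json")
relationship = route_on_attribute(ff, "mime.type", "application/json")
```

In NiFi itself these steps are configured in the Web UI and scheduled per processor; the value of the model is that every step is a small, inspectable transformation on a FlowFile.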