… and AI models.

Data Engineer Required Experience
- Data engineering experience (2+ years)
- Cloud platform proficiency (e.g., AWS, Azure, GCP)
- Data pipeline development (e.g., Airflow, Apache Spark)
- SQL proficiency, database design
- Visualization tools knowledge (e.g., Tableau, Power BI, Looker)

Data Engineer Application Process
This is a 1-year contract requirement with …
…/knowledge:
- Proficiency in programming languages such as Java, Python, or Go, and experience with API frameworks
- Experience with streaming technologies such as Kafka, Apigee, Apache Flink, or Spark Streaming, and real-time data processing frameworks
- Strong understanding of microservices architecture, containerization, and cloud computing platforms
- Solid understanding of API …
Newport, Gwent, Wales, United Kingdom Hybrid / WFH Options
Maclean Moore Ltd
… Developer.

ROLE: GCP DATA ENGINEER
LOCATION: NEWPORT OR CARDIFF (HYBRID)
IR35 STATUS: INSIDE
LENGTH: 6 MONTHS

Required experience:
- Expertise in Python and Dataflow/Apache Beam
- Experience in handling streaming data
- Strong experience in database replication using message-based CDC
- Experience in using Kafka implementations in a secured cloud …
… development and deployment of large-scale data streaming pipelines in GCP. Work on a data streaming POC.

Experience required:
- Expertise in Python and Dataflow/Apache Beam
- Experience in handling streaming data
- Strong experience in database replication using message-based CDC
- Experience in using Kafka implementations in a secured cloud …
… science, e.g., SQL, R, and Python, alongside the ability to use tools and packages such as Alteryx, Jupyter Notebook, R Markdown, TensorFlow, Keras, PyTorch, Apache Spark, etc.
- Practical proficiency in producing reproducible code and pipelines, including documentation, governance and assurance frameworks, automation, and code review using tools such as …
Newcastle Upon Tyne, United Kingdom Hybrid / WFH Options
NHS Counter Fraud Authority
… like BIND DNS and SMTP.
- Demonstrable experience with managing certificates
- Technical skills in load balancing, preferably F5/Cloudflare
- Experience in web hosting with Apache and Tomcat
- Practical experience with Puppet
- Skills with container management, lifecycles, and ECS
- Skills and expertise with Terraform

Please note that the assignment is inside …
… Sharing
- WSUS/Ivanti Security Controls
- Linux and Unix operating systems
- Citrix and Windows Terminal Servers
- CommVault backups
- Server hardware
- Web servers (IIS/Apache)
- Automation (administrative scripting languages, e.g. PowerShell)
- Microsoft Office 365
- Extensive knowledge of system technologies, concepts, tools, and practices, such as TCP/IP, DNS, …
… a good understanding of CVEs and their remediation, with a good knowledge of security. You must also have good experience of configuration parameters in Apache and Tomcat, administrative security settings in AIX, Solaris, and RHEL, and common infrastructure, including Active Directory, GPO, and Kerberos. Please apply ASAP to discuss further.
… to develop unit test cases. Help in backlog grooming.

Key skills:
- Extensive experience in developing big data pipelines in the cloud using big data technologies such as Apache Spark
- Expertise in performing complex data transformations using Spark SQL queries
- Experience in orchestrating data pipelines using Apache Airflow
- Proficiency in Git-based …
… so any knowledge of cross-domain solutions or air-gapped environments is a plus.
- AWS as initial hosting provider
- Containerised apps using Docker and Kubernetes
- Apache Jena
- Elastic
- PostGIS
- Kafka
- Apache NiFi
- AWS Cognito
- HTTP REST, GraphQL, and SPARQL interfaces
- Web apps based on HTML/CSS/JavaScript frameworks …
Manchester, North West, United Kingdom Hybrid / WFH Options
Searchability NS&D Ltd
… existing architectural components, including data ingest, data stores, and REST APIs.

THE DEVOPS ENGINEER SHOULD HAVE…
You must have an active eDV clearance.
- Apache NiFi
- Flink
- Java
- Ansible
- Docker
- Kubernetes
- ELK stack
- Linux sysadmin for deployed clusters (tens of servers)
- Jenkins pipeline development
- Integration/debugging
…

DEVOPS ENGINEERS - MANCHESTER
KEY SKILLS: SOFTWARE DEVELOPER/SOFTWARE ENGINEER/SENIOR SOFTWARE DEVELOPER/SENIOR SOFTWARE ENGINEER/DEVOPS ENGINEER/DEVOPS/APACHE NIFI/FLINK/JAVA/ANSIBLE/DOCKER/KUBERNETES/ELK STACK/TERRAFORM/LINUX/GIT …
Data Engineer | 6-Month Contract | Inside IR35 | £450/day | Hiring Immediately

Job Description (Apache Iceberg, Spark, Big Data)

Job Details Overview:
- Overall IT experience of 5+ years, with strong programming skills
- Excellent skills in Apache Iceberg, Spark, and big data
- 3+ years of big data project development experience
- Hands-on experience in areas such as Apache Iceberg, Spark, Hadoop, and Hive
- Must have knowledge of a database, e.g. Postgres, Oracle, or MongoDB
- Excellent in SDLC processes and DevOps knowledge (Jira, Jenkins pipelines)
- Working in an Agile pod and with team collaboration
- Ability to participate in deep technical …