Warwick, Warwickshire, West Midlands, United Kingdom
Tata Technologies Europe Ltd
… data analysis techniques
- Ability to produce clear graphical representations and data visualisations
- Knowledge of data analysis tools, for example: R Programming, Tableau Public, SAS, Apache Spark, Excel, RapidMiner
- Knowledge of data modelling, data cleansing, and data enrichment techniques
- An understanding of data protection issues
- Experience of working with Cloud …
Bingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Crimson
… computer vision algorithms
- Expertise in Python, Java, C++, PHP; experience with frameworks like Flask and Django
- Familiar with OpenCV, TensorFlow, PyTorch
- Experience with MySQL, MongoDB, Apache, and UI/UX design principles
- Strong collaboration and teamwork abilities
- Excellent organizational and analytical skills
What's on offer? Work on cutting-edge …
Employment Type: Contract
Rate: £400 - £425/day Flexible working opportunity
- Knowledge of GitLab CI/CD pipelines
- DevOps practices and principles
- AWS services
- XML/JSON
- Working with large databases
- Tools: SVN, Git, Tomcat, Apache, Jenkins, Jira & Confluence
- API and web services (REST)
- SQL
- Linux/Unix
Personal Attributes: Well organised, with the ability to delve into software issues …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Damia Group Ltd
Spark Scala Developer - Scala/Apache Spark - Hybrid/Leeds - £450-£550

Spark Scala Developer to join our client, one of the biggest financial services organizations in the world, with operations in more than 38 countries and an IT infrastructure of 200,000+ servers and 20,000+ database instances …

… Engineer, you will be working for the GDT (Global Data Technology) team and will be responsible for:
- Designing, building, and maintaining data pipelines using Apache Spark and Scala
- Working on enterprise-scale Cloud infrastructure and Cloud services in one of the Clouds (GCP)

Mandatory skills:
- At least … IT experience with designing, building, and maintaining data pipelines
- At least 4+ years of experience designing, building, and maintaining data pipelines using Apache Spark and Scala
- Programming languages: proficiency in Scala and Spark is essential; familiarity with Python and SQL is often a plus
- Big Data technologies …