communication skills. We're excited if you have 7+ years of experience delivering multi-tier, highly scalable, distributed web applications. Experience working with distributed computing frameworks: Hive/Hadoop, Apache Spark, Kafka, Airflow. Working with programming languages Python, Java, SQL. Working on building ETL (Extraction, Transformation and Loading) solutions using PySpark. Experience in SQL/NoSQL database design …
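For context, the PySpark ETL work this listing describes typically means reading raw data, applying transformations, and writing a curated output. The sketch below is a minimal, hypothetical illustration of that pattern; the paths, column names, and aggregation are assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch of an extract-transform-load job in PySpark.
# Input path, column names, and output location are hypothetical.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV data.
raw = spark.read.csv("s3://example-bucket/raw/orders.csv", header=True, inferSchema=True)

# Transform: drop malformed rows and derive a daily revenue aggregate.
clean = raw.dropna(subset=["order_id", "amount"])
daily = (
    clean.groupBy(F.to_date("order_ts").alias("order_date"))
         .agg(F.sum("amount").alias("revenue"))
)

# Load: write the result as partitioned Parquet for downstream consumers.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)

spark.stop()
```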
in data engineering, with a strong emphasis on data design and architecture. Proven proficiency in SQL and experience with relational databases. Practical experience with big data technologies such as Hadoop or Spark. In-depth understanding of data warehousing concepts and ETL frameworks. Familiarity with cloud platforms including AWS, Azure, or GCP. Strong analytical and problem-solving skills, with the …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
CHEP UK Ltd
such as Python, R, and SQL for data analysis and model development. Experience working with cloud computing platforms including AWS and Azure, and familiarity with distributed computing frameworks like Hadoop and Spark. Deep understanding of supply chain operations and the ability to apply data science methods to solve real-world business problems effectively. Strong foundational knowledge in mathematics and …
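As an illustration of the Python-based data analysis and model development this role describes, the sketch below fits a simple regression model on hypothetical shipment data; the file name, feature columns, and target variable are assumptions for illustration only.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Minimal sketch of a model-development workflow on supply-chain data.
# The file, features, and target below are hypothetical.
df = pd.read_csv("shipments.csv")
features = df[["distance_km", "pallet_count", "day_of_week"]]
target = df["delivery_days"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42
)

# Fit a baseline model and report error on held-out data.
model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```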
a data science team, mentoring junior colleagues and driving technical direction. Experience working with Agile methodologies in a collaborative team setting. Extensive experience with big data tools, such as Hadoop and Spark, for managing and processing large-scale datasets. Extensive experience with cloud platforms, particularly Microsoft Azure, for building and deploying data science solutions. Why Join? You'll be …
bring: Significant experience in data engineering, including leading or mentoring technical teams. Deep understanding of cloud environments such as Azure, AWS, or Google Cloud Platform, and tools like Synapse, Hadoop, or Snowflake. Hands-on experience with programming languages such as Python, Java, or Scala. Strong knowledge of data architecture, modelling, and governance. A track record of delivering complex data …
Bolton, England, United Kingdom Hybrid / WFH Options
Anson McCade
with NoSQL databases (e.g. MongoDB, InfluxDB, Neo4J). • Strong data exchange and processing expertise (ETL, ESB, APIs). • Programming experience, ideally in Python. • Understanding of big data technologies (e.g. Hadoop stack). • Knowledge of NLP and OCR methodologies. • Familiarity with Generative AI concepts and tools (advantageous). • Experience with containerisation (e.g. Docker) is desirable. • Background in industrial and/ …
technologies (e.g., MongoDB, InfluxDB, Neo4J). Experience with data exchange and processing (ETL, ESB, APIs). Proficiency in Python or similar programming languages. Familiarity with big data frameworks (e.g., Hadoop ecosystem). Desirable Skills: Understanding of NLP (Natural Language Processing) and OCR (Optical Character Recognition). Exposure to Generative AI concepts and tools. Experience with containerisation (e.g., Docker). …
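To ground the "data exchange and processing" and NoSQL experience this listing asks for, the sketch below shows a small Python extract-and-load step that pulls records from a REST API and upserts them into MongoDB. The endpoint, database, collection, and field names are hypothetical, not taken from the posting.

```python
import requests
from pymongo import MongoClient

# Minimal sketch of a Python data-exchange step: API extract, light
# transform, load into MongoDB. All names below are hypothetical.
client = MongoClient("mongodb://localhost:27017")
collection = client["sensors"]["readings"]

# Extract: fetch JSON records from a (hypothetical) REST API.
response = requests.get("https://example.com/api/readings", timeout=30)
response.raise_for_status()
records = response.json()

# Transform: keep only the fields downstream consumers need.
docs = [{"sensor_id": r["id"], "value": r["value"], "ts": r["timestamp"]} for r in records]

# Load: upsert into MongoDB keyed on sensor id and timestamp.
for doc in docs:
    collection.replace_one(
        {"sensor_id": doc["sensor_id"], "ts": doc["ts"]}, doc, upsert=True
    )
```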
overview. Should have hands-on experience in creating reports in Microsoft Excel. Capable of understanding ITIL terminology and Service Management and Software Development Lifecycle concepts. Knowledge of Hadoop and ITIL – Alerting and Monitoring, Change Management, Problem Management and Incident Management. Knowledge of service protection and change exception handling. Knowledge and understanding of the banking domain and IT infrastructure …