Stoke-on-Trent, Staffordshire, United Kingdom Hybrid / WFH Options
Synectics Solutions Ltd
knowledge of Design Patterns and Clean Code. Good knowledge of web architecture: client-server model, three-tier model, Service-Oriented Architecture (SOA), microservices. Knowledge of web servers: IIS, Apache, Nginx. Knowledge of SQL/T-SQL: writing queries, stored procedures, views. Knowledge of SSIS and/or SSRS. Knowledge of deployment processes (CI/CD). Knowledge of testing (TDD) …
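The SQL skills this ad lists (writing queries and views; stored procedures in T-SQL proper) can be sketched with Python's built-in sqlite3 as a lightweight stand-in for SQL Server. The table, columns, and data here are hypothetical, purely for illustration.

```python
import sqlite3

# In-memory database standing in for a SQL Server instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)])

# A view encapsulating an aggregate query, as a view would in T-SQL.
conn.execute("""
    CREATE VIEW customer_totals AS
    SELECT customer, SUM(total) AS spend
    FROM orders
    GROUP BY customer
""")

rows = conn.execute(
    "SELECT customer, spend FROM customer_totals ORDER BY spend DESC"
).fetchall()
print(rows)  # → [('acme', 200.0), ('globex', 50.0)]
```

SQLite has no stored procedures, so in a real T-SQL environment the aggregate would more likely live in a procedure invoked with EXEC; the view pattern above carries over directly.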
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
KO2 Embedded Recruitment Solutions LTD
team, you'll deliver Linux infrastructure solutions and support for a diverse range of clients. Expect to work with: Linux distributions: Debian, Ubuntu, Red Hat Enterprise Linux Web stacks: Apache, Nginx, MySQL, PostgreSQL, PHP, Python Networking: Static/dynamic routing, DNS, VPNs, and firewalls Containers & automation: Docker, Kubernetes, and CI/CD pipelines Cloud platforms: AWS, Azure, and Google …
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
cloud platforms (AWS, Azure). Hands-on experience with monitoring tools such as Splunk, Splunk ITSI, Dynatrace, AppDynamics, and synthetic monitoring platforms. Familiarity with enterprise systems such as WebLogic, Apache, Oracle, and SQL. Ability to analyse and resolve complex technical problems and document solutions effectively. Excellent communication and collaboration skills, with a proactive and detail-oriented mindset. Desirable Certifications …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … platform integrates Python and Snowflake, and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!). You'll also have exposure to streaming platforms like Apache Kafka, be able to develop and maintain ELT pipelines, and bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow; from a cloud perspective, you will naturally have a good understanding of AWS. I'd love you to be an advocate of Agile too - these guys are massive on …
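The ELT pattern this ad centres on (land raw events first, transform inside the warehouse afterwards) can be sketched in plain Python. Kafka and Snowflake are simulated with in-memory stand-ins here, and the topic payloads, field names, and helper functions are all hypothetical.

```python
import json

# Simulated Kafka messages: raw JSON payloads as they would arrive off a topic.
raw_events = [
    json.dumps({"user_id": 1, "action": "login"}),
    json.dumps({"user_id": 2, "action": "purchase", "amount": 30}),
]

def land_raw(events):
    """Extract-and-load step: persist payloads untouched, schema-on-read style,
    as one would into a Snowflake VARIANT staging table."""
    return [{"payload": e} for e in events]

def transform(landed):
    """Warehouse-side transform step: parse and filter AFTER loading,
    which is what distinguishes ELT from ETL."""
    parsed = [json.loads(row["payload"]) for row in landed]
    return [p for p in parsed if p["action"] == "purchase"]

staged = land_raw(raw_events)
purchases = transform(staged)
print(purchases)  # → [{'user_id': 2, 'action': 'purchase', 'amount': 30}]
```

In production the two steps would typically be separate Airflow tasks, so a failed transform can be re-run against the already-landed raw data without re-consuming the topic.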
West Midlands, United Kingdom Hybrid / WFH Options
Experis
data pipelines within enterprise-grade on-prem systems. Key Responsibilities: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support operational … platform. Ensure compliance with enterprise security and data governance standards. Required Skills & Experience: Minimum 5 years of experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience working …
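The Spark Streaming work this role describes rests on the micro-batch model: an unbounded stream is cut into fixed-size batches, with stateful aggregates carried across them. A minimal stdlib sketch, assuming a hypothetical event stream and batch size (no Spark cluster required):

```python
from collections import Counter
from itertools import islice

def micro_batches(stream, batch_size):
    """Yield fixed-size batches from an (in principle unbounded) stream,
    mimicking how Spark Streaming discretises input into micro-batches."""
    it = iter(stream)
    while batch := list(islice(it, batch_size)):
        yield batch

# Hypothetical operational events, e.g. log severities from infrastructure.
events = ["error", "ok", "ok", "error", "ok"]

running = Counter()
for batch in micro_batches(events, batch_size=2):
    running.update(batch)  # stateful aggregation carried across batches

print(dict(running))  # → {'error': 2, 'ok': 3}
```

In actual Spark the per-batch state would be managed by the engine (e.g. a stateful aggregation over a streaming DataFrame) rather than a loop, but the batching-plus-running-state shape is the same.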