Bromley, Kent, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
focusing on data quality, metadata management, and access control. Collaborate with IT and business stakeholders to align data architecture with organizational goals. Evaluate and implement big data technologies and distributed computing frameworks as needed. Maintain expertise in AWS data services and platforms, leveraging cloud capabilities to optimize data infrastructure. Required Skills and Experience: Proven experience as a Data … platforms and related services. Solid grasp of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies such as Spark and Hadoop, and distributed computing concepts. Proficiency in SQL and at least one programming language (e.g., Python, Java). Preferred Qualifications: Relevant certifications in data architecture, cloud platforms, or data governance. Experience …
platforms and their respective data services. Solid understanding of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies (e.g., Spark, Hadoop) and distributed computing. Proficiency in SQL and at least one programming language (e.g., Python, Java). 6 Month Contract. Inside IR35. Immediately available. London, up to 2 times a month in the …
Job Summary: We are seeking an AI Infra Engineer to design, develop, and optimize distributed AI systems for serverless AI platforms. The successful candidate will leverage expertise in large language models (LLMs) and system design to build robust, scalable solutions. This role offers a unique opportunity to contribute to innovative AI-driven systems, collaborating with cross-functional teams to deliver … high-impact solutions in a fast-paced, research-driven environment. Key Responsibilities: Design and implement scalable, distributed systems to support AI-driven workloads, ensuring high performance and reliability. Develop robust software solutions using Python (and potentially C++) to address complex technical challenges in AI and distributed computing. Work within a larger team to rapidly develop proof-of-concept … validate research ideas and integrate them into production systems and serverless infrastructure. Work closely with cross-functional teams to participate in developing innovative AI infrastructure, data systems, and cloud computing technologies. Implement resource scheduling and orchestration mechanisms to ensure efficient execution of distributed tasks (a minimal sketch follows this listing). Required: Education: Bachelor's or Master's degree in Computer Science or a related …
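For context on the "resource scheduling and orchestration" responsibility above, a minimal illustrative sketch is shown below. It is not taken from this role or any specific platform; the slot count, function names, and timings are hypothetical. The idea is simply to cap concurrent use of a scarce resource (e.g., accelerator slots) with a semaphore while a thread pool fans distributed tasks out.

```python
# Minimal, hypothetical sketch of resource-aware task scheduling:
# a semaphore enforces the resource budget, a thread pool runs tasks.
import concurrent.futures
import threading
import time

GPU_SLOTS = 2  # assumed number of accelerator slots available
gpu_semaphore = threading.Semaphore(GPU_SLOTS)

def run_inference(task_id: int) -> str:
    """Stand-in for one distributed inference task on a limited resource."""
    with gpu_semaphore:      # acquire a slot before doing any work
        time.sleep(0.1)      # placeholder for actual model execution
        return f"task {task_id} done"

def schedule(tasks: range) -> list:
    """Submit all tasks to a pool; the semaphore caps concurrent resource use."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
        futures = [pool.submit(run_inference, t) for t in tasks]
        return [f.result() for f in concurrent.futures.as_completed(futures)]

if __name__ == "__main__":
    print(schedule(range(6)))
```

In practice this responsibility would more likely be met with an orchestrator such as Kubernetes or Ray rather than a hand-rolled pool; the sketch only illustrates the scheduling concept the listing refers to.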
relational, NoSQL, and data lake environments. Leverage AWS data platforms and services to deliver enterprise-grade solutions. Stay ahead of emerging technologies and best practices in big data and distributed computing. Required Skills & Experience: Proven experience as a Data Architect or in a senior data-focused role. Expertise in data modelling techniques and tools. Strong knowledge of data storage … ELT processes. Proficiency in AWS data platforms and services. Solid understanding of data governance principles (data quality, metadata, access control). Familiarity with big data technologies (Spark, Hadoop) and distributed computing. Advanced SQL skills and proficiency in at least one programming language (Python, Java). Additional Requirements: Immediate availability for an October start. Must be UK-based with unrestricted …