Job Title: Senior Big Data Engineer/Architect – Scala – Distributed Systems Location: London AMS is the world’s leading provider of Talent Acquisition and Management Services. Our Contingent Workforce Solutions (CWS) service partners with Deutsche Bank to … support contingent recruitment processes. On behalf of Deutsche Bank, we are looking for a Senior Big Data Engineer/Architect with Scala for an initial 6-month contract on a hybrid basis in London. Deutsche Bank is a global banking business with strong roots in Germany and operations in over 70 countries. Their large but focused footprint … The role: This is a high-profile technology role within a globally recognised Tier-1 investment bank. Deutsche Bank is seeking a hands-on technologist with deep experience in big data engineering to support a major modernisation effort across its credit risk platform. The role sits at the heart of the bank’s stress testing function, ensuring the …
As a Big Data Solutions Architect (Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term customer engagements on their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects which require integrating with client systems … and productionizing customer use cases. Work with engagement managers to scope a variety of professional services work with input from the customer. Guide strategic customers as they implement transformational big data projects, including 3rd-party migrations and end-to-end design, build, and deployment of industry-leading big data and AI applications. Consult on architecture and design … a customer's successful understanding, evaluation, and adoption of Databricks. Provide an escalated level of support for customer operational issues. You will work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer's needs. Work with Engineering and Databricks Customer Support to provide product and …
As a Big Data Solutions Architect (Resident Solutions Architect) on our Professional Services team for the Emerging Enterprise & Digital Natives business in EMEA, you will engage with customers on short- to medium-term projects, helping them navigate their big data challenges using the Databricks Platform. You will deliver data engineering, data science … Work closely with Engagement Managers and customers to define project scope, schedules, and deliverables for professional services engagements. Enable transformational initiatives: guide strategic customers through their end-to-end big data journeys, migrating from legacy platforms and deploying industry-leading data and AI applications on the Databricks platform. Consult on architecture & design: provide thought leadership on solution … product feedback: provide implementation insights to Databricks Product and Support teams, guiding rapid improvements in features and troubleshooting for customers. What we look for: Extensive experience and proficiency in data engineering, data platforms, and analytics, with a strong track record of successful projects and in-depth knowledge of industry best practices. Comfortable writing code in either Python or …
Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
For this role, you will be responsible for providing the framework that appropriately replicates the Big Data needs of a company utilizing data. Essential requirements: More than 3 years of presales experience in the design of Big Data and Data analytics solutions according to customer requirements. Previous experience with the preparation of high-quality, engaging … the offer solution team, solution definition, effort and cost estimation. Past experience in dealing with partners, tool vendors, etc. Business Domain Knowledge: More than 5 years of experience in Big Data implementation projects. Experience in the definition of Big Data architecture with different tools and environments: Cloud (AWS, Azure, and GCP), Cloudera, NoSQL databases (Cassandra … MongoDB), ELK, Kafka, Snowflake, etc. Past experience with Data Engineering and data quality tools (Informatica, Talend, etc.). Previous involvement in working in a multilingual and multicultural environment. Proactive, passionate about technology, and highly motivated. Desirable requirements: Experience in Data analysis and visualization solutions: MicroStrategy, Qlik, Power BI, Tableau, Looker, … Background in Data Governance and Data …
Big Data Architect (Open source contributor), Slough Client: HCLTech Location: Slough, United Kingdom Job Category: Other EU work permit required: Yes Posted: 31.05.2025 Expiry Date: 15.07.2025 Job Description: HCLTech is a global technology company, with over 219,000 employees across 54 countries, delivering capabilities in … Skills: Apache Hadoop. We seek open-source contributors to Apache projects who have a deep understanding of the Apache ecosystem, experience with Cloudera or a similar distribution, and extensive knowledge of big data technologies. Requirements: Platform engineering and application engineering experience (hands-on). Design experience of open-source platforms based on Apache Hadoop. Experience integrating Infra-as-Code in platforms … These individuals should be active open-source contributors, capable of troubleshooting complex issues and supporting the migration and debugging of critical applications like RiskFinder. They must be experts in Big Data platform development using Apache Hadoop and in supporting Hadoop implementations in various environments.
Big Data Architect (Open source contributor), London Client: HCLTech Location: London, United Kingdom Job Category: Other EU work permit required: Yes Job Description: HCLTech is a global technology company, home to 219,000+ people across 54 countries, delivering industry-leading capabilities centered on digital, engineering, and cloud, powered by a broad portfolio of technology services and products. … projects, who have an in-depth understanding of the code behind the Apache ecosystem, should have experience in Cloudera or a similar distribution, and possess in-depth knowledge of the big data tech stack. Requirements: Experience in platform engineering along with application engineering (hands-on). Experience in the design of an open-source platform based on the Apache framework for Hadoop. Experience in integrating Infra-as-Code in their platform (bespoke implementation from scratch). Experience of design and architecture work for the open-source Apache platform in a hybrid cloud environment. Ability to debug and fix code in the open-source Apache codebase, and should be an individual contributor to open-source projects. Job description: The Apache …
design of an open-source platform based on the Apache framework for Hadoop. Experience in integrating Infra-as-Code in their platform (bespoke implementation from scratch). Experience of design and architecture work for the open-source Apache platform in a hybrid cloud environment. Ability to debug and fix code in the open-source Apache codebase, and should be an individual contributor … during delivery. Three individuals are required to support all developers in migrating and debugging various RiskFinder critical applications. They need to be "Developers" who are expert in designing and building Big Data platforms using Apache Hadoop and can support Apache Hadoop implementations both in cloud environments and on-premises.
Data Architect Active SC Clearance Location: Hybrid working, 2-3 days onsite, Gloucester Inside IR35 Company Overview: We are seeking a skilled Data Architect to join our team. The ideal candidate will be responsible for designing and implementing data architecture solutions that align with the organisation's goals and objectives. You will work closely with … stakeholders to understand their data needs and translate them into technical specifications. The data landscape is rapidly evolving, with businesses increasingly relying on data-driven insights to enhance operational efficiency and drive strategic growth. As organisations collect more data than ever before, the demand for skilled data professionals continues to rise. Our company is at … leveraging the latest technologies and methodologies to harness the power of data. By joining us, you will be part of an innovative environment that is shaping the future of data architecture and analytics. Key Responsibilities: - Design and implement comprehensive data architecture solutions that promote effective data management and analytics capabilities. - Utilise big data technologies to …
We are headhunting for an AiOps Architect and would like to present this role to you to gauge your interest, or to see if you can recommend suitable candidates. Our client is a mid-sized leader in AiOps Telecoms software products related to network management and service assurance. The role involves designing and supporting the implementation of the client's Cloud Native Big Data Analytics infrastructure layer. This infrastructure serves as the backbone for supporting AI/ML workflows, enabling real-time and batch processing for training and inference purposes. The typical data throughput involves analysis of continuous streams of terabytes of data per hour. We are seeking candidates with the following background: Software development experience, with extensive expertise in:
- Back-end data processing
- Data lakehouse architecture
Hands-on experience with Big Data open-source technologies such as:
- Apache Airflow
- Apache Kafka
- Apache Pekko
- Apache Spark & Spark Structured Streaming
- Delta Lake
- AWS Athena
- Trino
- MongoDB
- AWS S3, MinIO S3
Proven successful hands-on experience in: setting up data governance tooling …