designing, developing, testing, and deploying complex projects successfully. Competence in writing unit and integration tests to ensure code quality and reliability. Proficiency in SQL with a solid understanding of data models; knowledge of Python is preferred. Working knowledge of AWS cloud services (e.g., EC2, ECS, Load Balancer, Security Group, Lambda, S3). Experience in DevOps development and deployment using … environments such as IDEs, web & application servers, Git, Azure DevOps, and other modern development tools. Strong problem-solving skills with a solid understanding of software design fundamentals such as data structures and algorithms. Ability to collaborate effectively with product and UX teams to translate UI designs into functional solutions while maintaining high accessibility standards. Ability to design complex … and written communication skills. Good work ethic, self-starter, and results-oriented. Additional Preferred Qualifications: Domain knowledge in the Financial Industry and Capital Markets is a plus. Experience with Big Data technologies (e.g. Kafka, Apache Spark, NoSQL). Knowledge of BI tools such as Power BI, MicroStrategy, etc. Exposure to Python and Scala. Exposure to the Salesforce ecosystem. About S&P Global Ratings …
The team you'll be working with: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in the full data modeling lifecycle, including designing, implementing, and maintaining complex data models that align with organizational goals and industry standards. This role requires a deep understanding of data architecture, data modeling methodologies, and experience with real-time data integrations. The successful candidate will collaborate with cross-functional teams to ensure optimal data structures supporting business intelligence, analytics, and operational needs. This role involves creating, assuring, and overseeing the implementation of data models within analytical and real-time streaming domains. What you'll be doing: Develop conceptual, logical, and physical data models to support data analytics, streaming, and data product implementation. Define and maintain data architecture standards, principles, and best practices. Ensure data models are aligned with business requirements and scalable for future needs. Work closely with business stakeholders, data engineers, data solution …
Data Architect at NTT DATA. We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in the full data modelling life cycle, i.e. designing, implementing, and maintaining complex data models that align with organisational goals and industry standards. This role requires a deep understanding of data architecture, data modelling methodologies, and ideally experience with real-time data integrations. The successful candidate will collaborate with cross-functional teams to ensure optimal data structures that support business intelligence, analytics, and operational requirements. This role involves creating, assuring, and overseeing the implementation of data models within analytical and real-time streaming domains. What you'll be doing: Develop conceptual, logical, and physical data models to support data analytics, streaming, and data product implementation. Define and maintain data architecture standards …
We are looking for a Senior Data Engineer to join our growing team. Our data team is responsible for migrating clients' legacy data onto the Intapp DealCloud platform using Extract, Transform and Load processes. They provide expert guidance to clients, execute on the technical aspects of a migration delivery, and are knowledgeable on a wide variety of legacy data structures and best practices to ensure we are delivering a first-class service across the board. What you will do: Migrate legacy client data into Intapp products utilizing Python, SQL and other ETL pipeline processes. Lead Data Services workstreams as part of a Project team to deliver Data Services for Intapp's customers. Lead internal development processes and consult on ETL best practices. Operate in a client-facing role to provide analysis of client needs and data mapping recommendations. Conduct data management and data mapping workshops related to the proposed solution. Work cross-functionally with several teams, including the product teams, implementation teams, and professional services teams, in order to …
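For illustration only, here is a minimal sketch of the kind of Python/SQL migration step described above. It uses only the standard library, and the table, column, and file names are entirely hypothetical; it does not reflect any actual DealCloud import API.

    # Minimal ETL sketch: extract contacts from a hypothetical legacy SQLite database,
    # normalise a couple of fields, and write a CSV ready for the target platform's import tooling.
    import csv
    import sqlite3

    def extract(conn):
        # Hypothetical legacy table and columns; adjust per client schema.
        return conn.execute(
            "SELECT first_name, last_name, email, company FROM legacy_contacts"
        )

    def transform(row):
        # Conform legacy fields to the target import layout.
        first, last, email, company = row
        return {
            "FullName": f"{(first or '').strip()} {(last or '').strip()}".strip(),
            "Email": (email or "").strip().lower(),
            "Company": (company or "").strip(),
        }

    def load(records, path="contacts_import.csv"):
        # Write a flat file for review or bulk import.
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["FullName", "Email", "Company"])
            writer.writeheader()
            writer.writerows(records)

    if __name__ == "__main__":
        with sqlite3.connect("legacy_crm.db") as conn:
            load(transform(row) for row in extract(conn))

Keeping extract, transform, and load as separate functions makes each step easy to unit test and to swap out per client, which matches the emphasis these roles place on ETL best practices.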
and enterprise architecture is essential. You will collaborate with global and diverse teams, including Business Analysts, Project Management, Production Support, and Infrastructure. Price Master Central is a global reference data management application - responsible for sourcing Securities & Pricing data from market vendors and internal Citi sources and providing it to downstream clients after applying client-specific rules. Responsibilities: Deliver … DB. Extensive working knowledge of container platforms based on Kubernetes, Kafka, Redis. Experience with Unix commands and shell scripting. Strong understanding of design patterns and architectural principles. Familiarity with standard data structures and algorithms. Experience using the following tools - JIRA, Harness/uDeploy, SonarQube, TeamCity, Artifactory, Git (GHE & Bitbucket). Logical thinking, strong analytical and problem-solving skills; innovative and … Experience: Master's degree or PhD in a relevant field is desirable. Experience working with the Scrum methodology. Experience designing and implementing microservices. Financial services technology experience, preferably in the reference data domain. Physical and logical data modeling. Education: Bachelor's degree/University degree or equivalent experience; Master's degree preferred. What we'll provide you: By joining Citi …
We are looking for a hands-on, technically driven Data Engineer to join our expanding data team. This role is ideal for someone with a strong foundation in SQL and data management, who has recently graduated from a Data Science or Data Engineering programme and is eager to apply their skills in a modern, forward-thinking environment. You will be responsible for developing and maintaining robust data solutions that support reporting, automation, and strategic decision-making across the business. The successful candidate will have up-to-date knowledge of data tooling and methodologies, particularly in SQL, Power BI, and data automation, with an interest in modern data architecture such as cloud … As the role sits within a relatively new department, where methodologies and processes are still evolving, there is a significant opportunity to help shape the future of OEG’s data management practices. Key Responsibilities: Database and Data Engineering: Design, develop, and maintain SQL-based solutions and data pipelines. Manage relational databases, ensuring data can be extracted …
hackajob is collaborating with NTT DATA UK to connect them with exceptional tech professionals for this role. We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in the full data modelling life cycle, i.e. designing, implementing, and maintaining complex data models that align with organisational goals and industry standards. This role requires a deep understanding of data architecture, data modelling methodologies, and ideally experience with real-time data integrations. The successful candidate will collaborate with cross-functional teams to ensure optimal data structures that support business intelligence, analytics, and operational requirements. This role involves creating, assuring, and overseeing the implementation of data models within analytical and real-time streaming domains. SC (Security Check) clearance is required; candidates must hold it now or have held it in the past. Develop conceptual, logical, and physical data models to support data analytics, streaming, and data product implementation. Define and maintain data architecture standards, principles …
Chelsea and Westminster Hospital NHS Foundation Trust
Job summary Are you passionate about data engineering and managing data projects? Do you have an interest in the creation of solutions which allow NHS operational managers, clinicians and wider teams to deliver effective and high-quality services that make a real difference to patients and service users? This role will be responsible for managing data projects on behalf of the data engineering and business intelligence teams, providing technical and administrative support and expertise, with a focus on agile development practices. The post holder will communicate with project stakeholders, analyse data requirements, get involved in data transformations and data management tasks, and design and maintain task boards and project management documentation to enable the successful deployment of data projects. The successful candidate will provide essential support to the Data Engineering & BI teams as and when needed and play a crucial role in ensuring the successful execution of data and technology-driven projects. Main duties of the job Key responsibilities: Collaborate with cross-functional teams to identify business objectives and translate them into …
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
gen2fund.com
years of experience using QlikView version 11 or higher, with proven expertise in the following areas: Good knowledge of SQL, relational databases, and Dimensional Modeling. Experience working with large data sets and complex data models involving more than 10 tables. Integrating data from multiple sources into QlikView Data Models, including social media content and API extensions. … Use of complex QlikView functions and developing optimal scripts for solutions. Optimizing Dimensional data models for performance. Primary Responsibilities: Creating and providing reporting and dashboard applications using QlikView and NPrinting to facilitate better decision-making. Collaborating with stakeholders to gather requirements, and translating these into system and functional specifications. Creating prototypes and conducting proof of concepts with business leads … queries. Installing, configuring, and maintaining QlikView environments (QlikView, QlikSense, Publisher, NPrinting). Developing complex QlikView applications using advanced functions (set analysis, section access, alternate states, loop and reduce, etc.). Extracting data from various sources (SQL Server, Oracle, Excel, Hive), designing data structures to standardize and distribute information. Working with IT and security teams to implement security models. Mentoring …
high-ambiguity environment. Build, train, and deploy state-of-the-art models (e.g., deep learning, NLP, computer vision, reinforcement learning, or relevant domain-specific architectures). Design infrastructure for data ingestion, annotation, experimentation, model versioning, and monitoring. Collaborate closely with product, design, and DevOps to integrate AI features into our platform. Continuously evaluate new research, open-source tools, and … mentor, and grow an AI/ML team as we scale beyond our seed round. ⸻ Key Responsibilities: Architecture & Hands-On Development: Define and implement end-to-end AI pipelines: data collection/cleaning, feature engineering, model training, validation, and inference. Rapidly prototype novel models (e.g., neural networks, probabilistic models) using PyTorch, TensorFlow, JAX, or equivalent. Productionize models in cloud …
Our platform simplifies all administrative tasks, allowing entrepreneurs and small business owners to manage their business admin effortlessly and without the usual complexities. By harnessing cutting-edge technology and data analytics, coupled with outstanding customer service, we provide a seamless experience that emphasises simplicity, efficiency, and rapid execution. Our goal is to remove obstacles, enabling you to achieve success … fostering a collaborative learning environment. API Development: Design and build RESTful API endpoints using NodeJS and Express, including detailed API documentation. Database Management: Design, optimize, and maintain complex SQL data models, ensuring efficient data structures, query performance, and reliability. Full-Stack Development: Develop responsive and performant front-end applications using ReactJS, focusing on UX/UI best … a proactive and solution-oriented mindset. Soft Skills: Excellent communication, organisational skills, and the ability to work independently or within a team. Preferred Qualifications: Advanced Degrees: Master’s in Data Science or Software Engineering. Cloud & DevOps: Experience with AWS cloud services, Terraform, Docker, Kubernetes. Agile/DevOps: Experience with Agile methodologies, CI/CD pipelines, and project management tools …
you'll be doing Design, develop and deploy scalable web applications and services, being part of everything from contributing to the web technology and framework stack, to use case and data model design. Drive architecture discussions, take ownership and responsibility for new projects, and deliver high-quality software on tight timelines. Partner with security and compliance teams to ensure the … We're excited if you have 5+ years of experience in delivering multi-tier, highly scalable, distributed web applications. Deep understanding of software architecture, object-oriented design principles, and data structures. Extensive experience in developing microservices using Java, Python. Experience in distributed computing frameworks like Hive/Hadoop and Apache Spark. Good experience in test-driven development and automating … global footprint, and how we've grown, visit . By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
Backend software solution development. Experience with Agile development methodologies and SecDevOps practices. Excellent knowledge of secure coding practices. Strong knowledge of databases, SQL, and NoSQL, as well as data structures and algorithms. Excellent problem-solving skills and the ability to work in a fast-paced, evolving environment. Strong communication and collaboration skills, with the ability to articulate …
the product design, application functionality, and technical operations and processes. Some Other Highly Valued Skills May Include: Computer Science degree or equivalent knowledge of Computer Science, Object-Oriented Design and Data Structures. Experience of working in an IT project environment, preferably in the banking or financial sector. Ability to adapt to new tools, technologies, and methodologies. Willingness to keep up with … innovations and actively contribute to the organization’s technology communities to foster a culture of technical excellence and growth. Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions. Implementation of effective unit testing practices to ensure proper code design, readability, and reliability. Vice President Expectations: To contribute or set strategy, drive requirements …
governance. Collaboration with cross-functional teams and senior leadership. Good knowledge and experience of UNIX/Linux. Good knowledge and experience of programming languages: C++. Knowledge of networking, data structures, and databases. Experience with scripting (Perl, shell, Python, etc.). Strong analytical and problem-solving skills. Experience with Agile/DevOps. Experience with large-scale distributed systems is preferred. Experience …
experiences and building platforms that handle the complexity of modern cruising - all while keeping things collaborative and fun. Your Mission: Are you passionate about unlocking the power of customer data? We're looking for a talented Lead Data Engineer to spearhead the design, development, and optimisation of our critical CRM and customer data transformation. You'll play a pivotal role in building the data foundations for advanced analytics, personalised customer experiences, and effective marketing activation. As a senior member of the team, you will lead data engineers and champion best practices within our data environment. What You'll Do: Design & Build: Architect, build, test, and deploy robust, scalable, and reliable data pipelines, focusing on ingesting and transforming CRM and customer data from various sources. Lead & Innovate: Take technical ownership of customer data integration solutions within our data platform (AWS/SQL Server). Drive improvements and implement best-in-class data engineering practices. Ensure Quality: Champion data quality and governance for customer datasets. Implement robust monitoring, validation checks …
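As a purely illustrative aside, here is a minimal sketch of the kind of validation checks a customer-data pipeline like this might run. It uses pandas, the column names (customer_id, email) and file name are hypothetical, and a production pipeline would publish these results to monitoring and alerting rather than print them.

    # Data-quality sketch: each check returns a count of offending rows
    # so results can be trended over time instead of failing on the first issue.
    import pandas as pd

    def validate_customers(df: pd.DataFrame) -> dict:
        emails = df["email"].fillna("")
        return {
            "row_count": len(df),
            "duplicate_ids": int(df["customer_id"].duplicated().sum()),
            "blank_emails": int((emails == "").sum()),
            "malformed_emails": int((~emails.str.contains("@", regex=False) & (emails != "")).sum()),
        }

    if __name__ == "__main__":
        customers = pd.read_csv("crm_customers.csv")  # hypothetical CRM extract
        for check, count in validate_customers(customers).items():
            print(f"{check}: {count}")

Reporting counts rather than hard failures makes it easier to track data quality over time, which is the sort of governance emphasis the role describes.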
Job Title: Data & AI Engineer, Associate Manager CL8. Locations: London/Bristol/Manchester. Salary: Competitive salary and package (depending on level of experience). Please Note: Any offer of employment is subject to satisfactory BPSS and SC security clearance, which requires 5 years of continuous UK address history at the point of application. Accenture is a leading global professional services … all of us.” – Julie Sweet, Accenture CEO. Job Qualifications. Key responsibilities: Deploy machine learning models to production and implement measures to monitor their performance. Implement ETL pipelines and orchestrate data flows using batch and streaming technologies, based on software engineering best practice. Define, document and iterate data mappings based on concepts and principles of data modelling. Re-engineer data pipelines to be scalable, robust, automatable, and repeatable. Navigate, explore and query large-scale datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Identify and resolve data issues including data quality, data mapping, database and application issues. Implement data flows to connect operational systems, data …
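As an illustrative aside, here is a toy sketch of the dependency-ordered batch orchestration these responsibilities describe, using only the Python standard library. The three stage names are hypothetical, and a real delivery would typically use an orchestrator such as Airflow or a cloud-native equivalent rather than hand-rolled code.

    # Tiny illustration of dependency and workload management for a batch data flow.
    from graphlib import TopologicalSorter  # standard library since Python 3.9

    def extract():
        print("extract: pull raw source extracts into the landing zone")

    def transform():
        print("transform: clean, conform and model the records")

    def load():
        print("load: publish curated tables for analytics")

    tasks = {"extract": extract, "transform": transform, "load": load}
    # Map each task to the tasks it depends on; the sorter yields a valid run order.
    dependencies = {"transform": {"extract"}, "load": {"transform"}}

    for name in TopologicalSorter(dependencies).static_order():
        tasks[name]()

Expressing the pipeline as a dependency graph, rather than a fixed script, is what lets an orchestrator rerun, parallelise, or skip stages safely.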
We are looking for a Senior Software Engineer, Data to join our Knowledge Enrichment Team! Reporting to the team’s Engineering Manager, you will evolve BenchSci’s Knowledge Graph, integrate public life science data into our biological ontology, iterate on data models in various datastores, including Graph DB, improve internal tooling to allow data self-service, and operationalize production-grade data pipelines. As part of this role, you'll collaborate with a world-class team, experience growth and mentorship, and apply data engineering solutions to shape the future of scientific discovery. You Will: Scale data pipelines to allow our data to go from research to platform quickly and reliably. Manage sources that contain both semi-structured and unstructured biological data that contribute to the evolution of BenchSci’s Knowledge Graph. Integrate public life science data into the …
pipelines using Docker, Git, CI/CD methodologies, and cloud services. MarTech Innovation: Research and implement AI solutions specifically targeting marketing use cases (content generation, audience analytics, personalisation). Data Pipeline Architecture: Design robust data flows between marketing systems and AI services. Technical Leadership: Provide strategic direction, mentorship, and best practices for AI implementation across the team. Documentation … design, and cloud platforms (GCP, Azure). Skilled in containerisation and orchestration (Docker, Kubernetes), version control (Git, BitBucket), and CI/CD pipelines. Strong foundation in software design patterns, data structures, algorithms, and database technologies (SQL, NoSQL, vector databases). Preferred Qualifications: Experience in MarTech, AdTech, or B2B marketing technology, with a background in marketing data and … analytics platforms. Familiarity with prompt engineering, LLM optimisation, and designing data processing pipelines (ETL). Agile development experience (Scrum, Kanban) and a track record of integrating AI into business applications. What you need to do now: If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us …
Manage deployment pipelines using Docker, Git, CI/CD, and cloud services. MarTech Innovation: Research and implement AI solutions for marketing use cases like content generation, analytics, and personalisation. Data Pipeline Architecture: Design robust data flows between marketing systems and AI services. Technical Leadership: Provide strategic direction, mentorship, and best practices for AI implementation. Documentation: Create comprehensive technical … design, and cloud platforms (GCP, Azure). Skilled in containerisation and orchestration (Docker, Kubernetes), version control (Git, BitBucket), and CI/CD pipelines. Strong foundation in software design patterns, data structures, algorithms, and database technologies (SQL, NoSQL, vector databases). Preferred Qualifications: Experience in MarTech, AdTech, or B2B marketing technology, with a background in marketing data and … analytics platforms. Familiarity with prompt engineering, LLM optimisation, and designing data processing pipelines (ETL). Agile development experience (Scrum, Kanban) and a track record of integrating AI into business applications. What you need to do now: If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us …
purpose (Java, Cloud computing, HDFS, Spark, S3, ReactJS, Sybase IQ among many others). A glimpse of the interesting problems we engineer solutions for: acquiring high-quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces. … memory and CPU utilization. Perform statistical analyses to identify trends and exceptions related to Market Risk metrics. Build internal and external reporting for the output of risk metric calculations using data extraction tools, such as SQL, and data visualization tools, such as Tableau. Utilize web development technologies to facilitate application development for front-end UIs used for risk management … like Snowflake, Sybase IQ and distributed HDFS systems. Interact with business users to resolve issues with applications. Design and support batch processes using scheduling infrastructure for calculating and distributing data to other systems. Oversee junior technical team members in all aspects of the Software Development Life Cycle (SDLC), including design, code review and production migrations. Skills and Experience: Bachelor's …
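Purely as an illustration of the distributed risk aggregation described here, below is a minimal PySpark sketch over a hypothetical positions dataset; the S3 paths and column names (desk, as_of_date, delta_exposure) are invented for the example and do not reflect any real schema.

    # Distributed roll-up of a risk metric by desk and as-of date.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("risk-aggregation").getOrCreate()

    # Hypothetical positions dataset; path and columns are illustrative only.
    positions = spark.read.parquet("s3://example-bucket/positions/")

    desk_exposure = (
        positions
        .groupBy("desk", "as_of_date")
        .agg(F.sum("delta_exposure").alias("total_delta_exposure"))
    )

    # Persist the aggregated metrics for downstream reporting (e.g. SQL or Tableau extracts).
    desk_exposure.write.mode("overwrite").parquet("s3://example-bucket/risk_metrics/")

Grouping by desk and as-of date mirrors the kind of roll-up that downstream risk reports typically consume, with Spark handling the distribution of the computation.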
Lloyds Banking Group. We're on an exciting transformation journey and there could not be a better time to join us. The investments we're making in our people, data, and technology are leading to innovative projects, fresh possibilities and countless new ways for our people to work, learn, and thrive. What you'll need: Extensive Experience: Extensive professional … JPA). API Development: Strong experience in designing and consuming RESTful APIs (experience with gRPC and/or SOAP is a plus). Computer Science Fundamentals: Solid knowledge of Data Structures and Algorithms. Software Design Principles: Solid knowledge of Object-Oriented Design principles, Design Patterns, and Clean Code practices. Database Skills: Hands-on experience with Relational Databases (i.e. … shaping the financial services of the future, whilst the scale and reach of our Group means you'll have many opportunities to learn, grow and develop. We keep your data safe. So, we'll only ever ask you to provide confidential or sensitive information once you have formally been invited along to an interview or accepted a verbal offer.