of innovation! Are you a passionate technologist eager to build impactful solutions in a collaborative environment? At BMC Control-M's DataOps team, we're developing a next-generation data quality solution that ensures reliable, secure, and actionable insights for enterprise systems. We're looking for a skilled Python Developer who's excited to work with modern data … and collaboration skills. Nice to Have: Experience with containerization tools (Docker, Kubernetes, Helm). Hands-on work with AWS services (e.g., Lambda, S3, EMR). Familiarity with DataOps practices, data modeling and data warehousing concepts. Knowledge of security best practices (certificates, encryption). Comfortable working in an Agile development environment with strong testing practices. Our commitment to you …
and enterprise architecture is essential. You will collaborate with global and diverse teams, including Business Analysts, Project Management, Production Support, and Infrastructure. Price Master Central is a global reference data management application, responsible for sourcing Securities & Pricing data from market vendors and internal Citi sources and providing it to downstream clients after applying client-specific rules. Responsibilities: Deliver … DB. Extensive working knowledge of container platforms based on Kubernetes, Kafka, and Redis. Experience with Unix commands and shell scripting. Strong understanding of design patterns and architectural principles. Familiarity with standard data structures and algorithms. Experience using the following tools: JIRA, Harness/uDeploy, SonarQube, TeamCity, Artifactory, Git (GHE & Bitbucket). Logical thinking, strong analytical and problem-solving skills; innovative and solutions … Experience: Master's degree or PhD in a relevant field is desirable. Experience working with the Scrum methodology. Experience designing and implementing microservices. Financial services technology experience, preferably in the reference data domain. Physical and logical data modeling. Education: Bachelor's degree/University degree or equivalent experience; Master's degree preferred. What we'll provide you: By joining Citi …
vendors. Ideate and design end-to-end solutions for Guidewire ClaimCenter implementation and integration of Guidewire specific to the customer environment. Handle end-to-end architectural responsibilities from solution design, data migration planning, capacity planning, application security, process orchestration, application and data integration, etc. Conduct architectural and technical reviews to meet development standards. Provide necessary software development leadership to … architecture into the necessary application architecture. Create the application architecture and the high-level design for the chosen solution. Your Profile Essential skills/knowledge/experience: Understand the Guidewire Data Model and the Application Landscape. Experience in integrating Guidewire ClaimCenter with downstream systems, including finance systems (general ledger), payment systems and document management systems. Experience in integrating Guidewire … ClaimCenter V10 and SaaS Cloud with third-party/broker policy management systems. Experienced in migrating data from legacy systems to Guidewire ClaimCenter. Expertise in reconciling data from Guidewire with a data warehouse. Demonstrated experience and ability with Gosu programming, XML, PCF, REST and SOAP. Experience with source code management systems such as Subversion and GitHub. Knowledge of General …
of a system. Work with multiple, enterprise-wide distributed teams to deliver new capabilities in business applications. Design and develop APIs and UIs to enhance the use of large data sets, infrastructure, and user experience. Own the full lifecycle of web software development, from ideas to production, following and improving the Secure Software Development Life Cycle (SSDLC). Provide … ensuring adherence to best practices in design, coding, testing, and deployment. Review programming documentation and recommend changes in development, maintenance, and application standards. Analyze and develop logical database designs, data models, and relational data definitions across multiple computing environments (e.g., host-based, distributed systems, client-server). Comply with architectural standards, established methodologies, and practices, as well as … 8+ years of experience with Java, .NET, or Python. Minimum of 8+ years coordinating team efforts in a project or operations environment. Minimum of 5+ years of experience with databases and data modeling/design (SQL and NoSQL). Minimum of 5+ years in full stack development for cloud solutions (Azure or AWS), Azure preferred. Skills and Knowledge: 5+ years in full …
Annapolis Junction, Maryland, United States (Hybrid/WFH options)
Codescratch LLC
cutting-edge analytics and tools in support of the cyber mission space. Join our team to contribute to innovative engineering projects, creating reliable, scalable, and high-performing software for efficient data processing and informed decision-making. This position is ideal for individuals who are passionate about designing efficient, secure, and scalable software and thrive in a collaborative environment. You'll … to be local to the MD or Northern VA/DC Metro area. Key Responsibilities: Design, develop, and maintain scalable, high-performance back-end services, APIs, and UIs for data management and visualization tools. Optimize database performance, data modeling, and processing for large-scale applications. Enhance existing back-end architecture to improve system responsiveness, security, and scalability. Develop … or years of experience. Programming skills in Python (Django, DRF, FastAPI) or React/TypeScript with Material UI, Git, SQL, Playwright/Cypress test libraries, testing, debugging. Experience with data visualization, RESTful APIs, RESTful web services, orchestration and containerization (e.g., Kubernetes, Docker). Experience with Golang, Kotlin/Java, and/or Python. U.S. citizenship required. Active TS/…
Senior Data Management Professional - Data Product Owner - Data AI (Location: London; Business Area: Data). Description & Requirements: Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock - from around the world. In Data, we are responsible … for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies, and we implement technology solutions to enhance our systems, products and processes - all while providing customer support to our clients. Our Team: The Bloomberg Data AI group brings innovative AI technologies into Bloomberg's Data … of AI-powered products. Our team provides evaluation and annotation frameworks. We partner closely with team members to align AI innovation with Bloomberg's strategic objectives, focusing on optimizing data workflows and elevating the quality, intelligence, and usability of the data that drives our products. Our work amplifies the impact of the Data organization by delivering intelligent …
An exciting opportunity has arisen for a Data Analyst with expertise in Data Governance to join an international bank based in London. This role is perfect for someone passionate about shaping data management practices, ensuring the highest standards of data quality, and fostering a culture of best practice across the business. You will play a pivotal … part in implementing a comprehensive Data Strategy, including the introduction of a Data Catalogue and robust governance policies. Key Responsibilities: Contribute to developing and refining Data Governance policies and procedures. Maintain the Data Catalogue, Business Glossary, and Data Lineage documentation. Engage with stakeholders to clarify Data Strategy goals and responsibilities. Coordinate meetings, facilitate discussions … and document outcomes. Collaborate across departments to align data initiatives with business processes. Stay updated on data management trends and best practices. Investigate and report on data issues, supporting resolution efforts. Help define and enforce Data Quality standards and methodologies. Perform additional tasks as assigned by senior leadership. Key Requirements: Relevant degree in Computer Science, Information …
Data Engineer I, Business Data Technologies. Business Data Technologies (BDT) makes it easier for teams across Amazon to produce, store, catalog, secure, move, and analyze data at massive scale. Our managed solutions combine standard AWS tooling, open-source products, and custom services to free teams from worrying about the complexities of operating at Amazon scale. This … lets BDT customers move beyond the engineering and operational burden associated with managing and scaling platforms, and instead focus on scaling the value they can glean from their data, both for their customers and their teams. We own one of the largest data lakes at Amazon, where thousands of Amazon teams can search, share, and … store exabytes (EB) of data in a secure and seamless way; using our solutions, teams around the world can schedule/process millions of workloads on a daily basis. We provide enterprise solutions that focus on compliance, security, integrity, and cost efficiency of operating and managing EBs of Amazon data. Key job responsibilities - Core responsibilities: Be hands-on with …
software developer who likes to solve business problems, Selling Partner Services (SPS) is the place for you. Our team is responsible for the Case Management System. We are looking for data engineers who thrive on complex problems and can operate complex, mission-critical systems under high load. Our systems manage case resolution with hundreds of millions of … requests, and respond to millions of service requests. We have great data engineering and science opportunities. We aim to provide customizable, LLM-based solutions to our clients. Do you think you are up for this challenge? Or would you like to learn more and stretch your skills and career? The successful candidate is expected to contribute to … all parts of the data engineering and deployment lifecycle, including design, development, documentation, testing and maintenance. They must possess good verbal and written communication skills, be self-driven and deliver high-quality results in a fast-paced environment. You will thrive in our collaborative environment, working alongside accomplished engineers who value teamwork and technical excellence. We're looking for …
Senior Data Architect. Joining Capco means joining an organisation that is committed to an inclusive working environment where you're encouraged to Be Yourself At Work. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It's important to us that we recruit and develop as diverse a range of talent as … there. Focused on maintaining our nimble, agile and entrepreneurial culture. Your Capco Day/Key Responsibilities: You will lead complex agile consulting projects, supporting our clients with data architecture design and delivery. You will work with technology leaders across Financial Services to provide best-practice guidance for data management and data architecture development and optimization. … platforms. Cloud architect/data architect certifications (AWS, GCP, Azure). Knowledge of the relevant procedures, architectures and technologies in one or more of the following topics: Contextual Data Modelling, Entity Relationship Modelling, Logical & Physical Data Modelling and Data Lake design. Experience designing and implementing cloud data migration and storage patterns on one or more of AWS, GCP …
Possibility of remote work: Hybrid (the number of WFO days will depend on the client's needs). Contract duration: 6 months. Location: London. JOB DETAILS - Role Title: Senior Data Engineer. Note: (Please do not submit the same profiles as for 111721-1.) Required Core Skills: Databricks, AWS, Python, PySpark, data modelling. Minimum years of experience: 7 years. Job … Description: Must have hands-on experience in designing, developing, and maintaining data pipelines and data streams. Must have a strong working knowledge of moving/transforming data across layers (Bronze, Silver, Gold) using ADF, Python, and PySpark. Must have hands-on experience with PySpark, Python, AWS, and data modelling. Must have experience in ETL processes. Must have … hands-on experience in Databricks development. Good to have: experience in developing and maintaining data integrity and accuracy, and data governance and data security policies and procedures. Must have hands-on experience in SQL development. Excellent communication skills and logical thinking.
and residential sectors, your role will directly influence business strategy. Key Responsibilities: Analyse external and internal datasets to produce actionable insights for seniors housing market strategies. Develop and automate data modelling processes, creating impactful visualisations and dashboards. Collaborate closely with internal stakeholders, contributing directly to market-leading reports and client presentations. Provide rapid responses to ad-hoc data … requests, enabling informed business decisions. Identify and communicate key market trends to shape consultancy, valuation, and agency strategies. Role Requirements: 3-5 years' professional experience in data analytics, ideally within real estate or property consultancy. Advanced proficiency in data modelling and analytics tools (Power BI, ArcGIS, Flourish). Strong coding skills in Python or similar languages. … Solid understanding of real estate data sources (e.g., Co-Star, RCA, Property Data). Familiarity with geospatial analysis software (e.g., ArcGIS Pro) and SQL databases.
Manchester, Lancashire, England, United Kingdom (Hybrid/WFH options)
WüNDER TALENT
Azure Data Engineer/Architect | Customer-Facing Experience Essential | Azure, Synapse | £550 p/day Outside IR35 - Remote (UK). Role: Azure Data Engineer/Architect with customer-facing experience. Duration: 3 months initially. Location: Remote (UK). Start Date: ASAP. Rate: £550 p/day Outside IR35. About The Role: Our partner is looking for a Senior Data Engineer/Architect … to play a key role in shaping their data infrastructure, leveraging the latest Azure technologies. You'll be working hands-on with Synapse and Spark technologies in enterprise-scale data environments. Additionally, you will gain exposure to Microsoft Fabric. This is a contract role for an experienced, customer-facing person who thrives in complex data environments … and can also bridge the gap between hands-on engineering and data architecture. Responsibilities: Design, build and maintain scalable data pipelines using Synapse and Spark. Develop and optimise data processing code in Python and PySpark. Support data architecture design and ensure alignment with enterprise data strategy. Collaborate with business and technical stakeholders to gather and …
Data Engineer (Mid-Level) – Halifax/Hybrid. Location: Halifax, UK (Hybrid, 2–3 days in-office). Salary: £40,000-£50,000. Type: Full-Time | Permanent. A fast-growing, tech-driven business is seeking a Data Engineer to help build scalable data infrastructure during a major platform transformation. You'll work closely with technical leadership and analysts to … bridge legacy systems with a new modern data stack, enabling better insights, automation, and AI-driven innovation. Key Responsibilities: Build and maintain scalable ETL/ELT pipelines across legacy and modern platforms. Collaborate with analysts and AI engineers on data modelling and reporting. Clean, transform and structure complex datasets for business use. Contribute to data architecture … planning and strategy. Improve data access, visibility, and performance. What You'll Need - Essential: 2–4 years' experience in a data engineering role. Strong SQL and relational database knowledge. Experience building data pipelines and data models. Familiarity with legacy systems (ideally Progress DB). Python for data scripting. Great problem-solving and communication skills. Desirable: PHP …
Data Scientist - Journey Optimisation (London, Full time, job requisition ID JR0024; applications close June 27, 2025). What will you be doing day-to-day? Use sophisticated … statistical and machine learning techniques to identify new trends and relationships in data. Harvest, wrangle and prototype new data sources internal and external to NewDay to create new value for NewDay and our customers. Provide detailed, high-quality data science outputs, sharing and following up with as much detail as appropriate or requested by senior managers. Develop knowledge … of all relevant data resources within NewDay and in the wider credit industry. Governance: support the models throughout their lifecycle from conception, development, implementation, testing and monitoring, with the required level of documentation to follow internal procedures and standards. Your Skills and Experience - Essential: At least a BSc or higher university degree in a data science-related field …
Responsibilities for the role are as follows: Drive best-in-class understanding and knowledge of data to maximise its value. Utilise segmentation and decisioning tools to implement intricate business strategies. Apply intelligent data modelling and outstanding data quality to our wealth of data. Partner with the business to identify issues, recommend solutions and solve complex … looking for? Degree in Maths, Economics, Physics or a numerate-focused degree (min 2:1). A high level of analytical and numerical problem-solving skills. An ability to use data interrogation, manipulation and reporting/dashboard creation tools (e.g., Excel, T-SQL queries, SharePoint, Power BI, SSRS, etc.). Although desirable, this isn't essential, as training will be provided.
What's the role? The Associate Data Engineer applies techniques such as business intelligence (BI), reporting and data modelling, analytical techniques and data processing to provide the business with relevant information to increase revenues, improve operational efficiency, optimise customer programs, respond quickly to emerging market trends and gain a competitive edge in the market. You will … information among different individuals, organisations and information systems throughout the information life cycle. What you'll be doing: Your role specifically focuses on the SAP domain. Contributes to multiple data engineering projects. Supports team members in defining and building the data pipelines that will enable faster, better, data-informed decision-making within the business. Supports the implementation of … a new data solution which supports high volumes and velocity of data and supports future growth. Works with the latest developments, testing and deployment techniques to support the deployment of new releases, e.g. to deploy new data pipelines and add data sources. Acts as part of a Scrum, or as a Scrum stakeholder (e.g. Agile White Belt …
Data Scientist - Credit Behaviours (London, Full time, job requisition ID JR0015; applications close July 6, 2025). What will you be doing day-to-day? Use sophisticated statistical and machine learning techniques to … identify new trends and relationships in data. Harvest, wrangle and prototype new data sources internal and external to NewDay to create new value for NewDay and our customers. Provide detailed, high-quality data science outputs, sharing and following up with as much detail as appropriate or requested by senior managers. Develop knowledge of all relevant data resources … and monitoring, with the required level of documentation to follow internal procedures and standards. Your Skills and Experience - Essential: At least a BSc or higher university degree in a data science-related field (e.g. machine learning, statistics, mathematics). Proficiency in statistical data modelling techniques. Proficiency with Python, including experience with statistics/machine learning packages such as …
Our Client: We power the future of Marketing and realise business value by applying human creativity to AI, data, and technology. Our expert consultants partner with leading brands to unlock the power of their enterprise data and transform their Marketing capabilities, building sustainable value in a privacy-respectful way. Integrating enterprise technologies, Google's ecosystem, and our proprietary … solutions to clients' first-party data, we create innovative solutions and unlock the power of AI, delivering tangible ROIs for our clients. Our expert consultants orchestrate Google Marketing and Google Cloud technologies to unlock ultimate AI-powered performance within our client's organisation. About the Role: As one of our Data Engineers, you will be part of a … team responsible for the development and overall delivery of big data platform solutions, automation solutions and data AI agents. You will be designing and proposing effective combinations of Google Marketing Platform tools (GA4, Campaign Manager 360, Search Ads 360, etc.) and Google Cloud solutions (BigQuery, BQ Sharing (Analytics Hub), Cloud Storage, APIs, Compute Engine, etc.) to address specific …
Want to work on real use cases like underwriting and claims with actual impact? Role overview: We are seeking an experienced Data Engineer to support the design and delivery of a modern, cloud-native data platform. This role is critical to the success of a large-scale transformation programme, focused on consolidating multiple data sources into a … unified, scalable architecture to enable improved analytics, data governance, and operational insights. Technical requirements: Substantial experience designing and implementing data solutions on Microsoft Azure. Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in DBT (Data Build Tool), with a strong understanding of modular pipeline development, testing … and version control. Familiarity with Power BI, particularly in integrating data models to support both enterprise reporting and self-service analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure, Power BI, DBT, Snowflake. About us: esynergy is a technology consultancy and we build products, platforms and services to …