Benefits: 401(k) matching Competitive salary Health insurance Paid time off About this Role: The Data Architect/Enterprise Data Analyst will provide data strategy, architecture, and analytics expertise to support enterprise data initiatives across the U.S. Department of Health and Human Services (HHS). This role will define and implement data architecture frameworks, data governance standards, and enterprise data management (EDM) solutions to improve data quality, accessibility, security, and interoperability across systems and organizations. The ideal candidate will have experience working in federal data environments, including designing enterprise data models, developing data integration strategies, and supporting data modernization and cloud migration projects. The role also requires strong … data quality best practices. Excellent communication skills, especially in complex federal environments. Desired Skills and Competencies: Experience supporting HHS or other federal health agencies (CMS, NIH, HRSA, FDA, CDC). Familiarity with federal data policies such as the Federal Data Strategy and CDO requirements. Experience with enterprise data warehouses or data lakes. Certifications such More ❯
powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. FS Technology Consulting - AI and Data - Data Engineer - Senior Consultant/Manager - Dublin/Cork/Limerick/Galway General Information Location: Dublin, Cork, Limerick, or Galway. Available for VISA Sponsorship: No Business Area … Data & Analytics Contract Type: Full Time - Permanent EY is the only major professional services firm with an integrated Financial Services practice across Europe, the Middle East, India, and Africa (EMEIA). We connect our Asset Management, Banking and Capital Markets and Insurance clients to 6,500 talented people from 12 countries and 35,000 Financial Services colleagues around the … Azure Functions and Logic Apps for automation. Snowflake: Strong SQL skills and experience with Snowflake's architecture (virtual warehouses, storage, cloud services). Proficiency in Snowflake Streams & Tasks for CDC and automation. Experience with Snowflake Secure Data Sharing and Snowflake Marketplace. Familiarity with Snowpark for Python/Java-based transformations. Understanding of role-based access control, data masking More ❯
Framingham, Massachusetts, United States Hybrid/Remote Options
Re:build Manufacturing
span a wide array of industries including aerospace, defense, mobility, healthcare, pharma, biotech, clean tech, chemicals, energy, lifestyle, food production, and industrial equipment. Who we are looking for The Data Engineer will focus on utilizing modern data technologies to operationalize and expand the enterprise Data Lake. This role centers on implementing efficient ingestion strategies, integrating diverse data … close collaboration with software engineers and technical leads, ensuring alignment with application domain models and product roadmaps. Build and operate batch, streaming, and change data capture (CDC) pipelines from diverse sources (ERP, CRM, Accounting, knowledge repositories, and other enterprise systems) into the data lake. Data Modeling & Enablement Model curated data within the lake into … systems with a strong understanding of cloud-based data lake architectures and data warehouses. Demonstrated expertise in designing and operating data pipelines (batch, streaming, CDC), including schema evolution, backfills, and performance tuning. Hands-on proficiency with Python and SQL, including experience with distributed processing frameworks (e.g., Apache Spark) and CI/CD for data More ❯
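The CDC pipeline work described in this posting boils down to applying an ordered stream of insert/update/delete events to a target table. A minimal sketch of that apply step is below; the event shape (`op`, `key`, `after`) is a simplifying assumption loosely modeled on Debezium-style change events, not any particular vendor's format or this employer's actual stack.

```python
# Minimal change data capture (CDC) apply step: fold an ordered stream of
# insert/update/delete events into a keyed in-memory "table".
# The event shape (op, key, after) is an illustrative assumption, loosely
# modeled on Debezium-style change events.

def apply_cdc_events(target, events):
    """Apply CDC events, in order, to a dict keyed by primary key."""
    for ev in events:
        op = ev["op"]
        if op in ("c", "u"):            # create or update: upsert the new row image
            target[ev["key"]] = ev["after"]
        elif op == "d":                 # delete: drop the row if present
            target.pop(ev["key"], None)
        else:
            raise ValueError(f"unknown op: {op!r}")
    return target

table = {1: {"name": "Acme", "tier": "bronze"}}
events = [
    {"op": "u", "key": 1, "after": {"name": "Acme", "tier": "gold"}},
    {"op": "c", "key": 2, "after": {"name": "Globex", "tier": "silver"}},
    {"op": "d", "key": 1},
]
print(apply_cdc_events(table, events))  # only the Globex row remains
```

Real pipelines add the concerns the posting names on top of this core loop: schema evolution (tolerating new columns in `after`), backfills (replaying history into an empty target), and ordering guarantees from the source log.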
Luton, Bedfordshire, South East, United Kingdom Hybrid/Remote Options
Stackstudio Digital Ltd
Job Title: CDC Qlik Talend Lead Location: Luton (Hybrid 2 to 3 days per week onsite) Job Type: Contract (Inside IR35) Job Summary: Join Tata Consultancy Services (TCS) as a CDC Qlik Talend Lead and play a pivotal role in delivering business-critical migration and integration projects for leading UK clients. You will utilize your expertise in Qlik Talend Cloud … and Change Data Capture (CDC) to lead end-to-end migration activities, develop robust data integration pipelines, and drive seamless collaboration across distributed teams. Key Responsibilities: Lead and execute end-to-end migration from on-cloud environments to Qlik Talend SaaS. Design, develop, and maintain scalable Talend Cloud pipelines and CDC workflows. Collaborate with cross-functional … components such as tMap, tJoin, tFileInput, etc. Solid experience in SQL, ETL/ELT design, and cloud data platforms. Strong understanding of Change Data Capture (CDC) concepts and data modeling. Experience integrating with Git and Talend CI tools. Excellent stakeholder management and communication skills across distributed teams. Person Specification: Independent, detail-oriented professional with a More ❯
Hoplite Solutions is seeking a highly experienced and motivated Data Engineer to join our big data platform team, with a strong focus on Oracle technologies. The ideal candidate will play a critical role in designing, building, and optimizing data pipelines and architecture that support our enterprise-scale data initiatives in an IC customer space. This role … demands technical expertise, strategic thinking, and hands-on experience with Oracle-based systems in high-volume environments. If you thrive in a fast-paced, data-driven organization and enjoy solving complex data engineering challenges, we want to hear from you. Required Qualifications: Minimum 6+ years of experience in data engineering, ETL development, or database management Strong expertise … schemas Knowledge of ETL pipeline design and implementation for large-scale data systems Experience in shell scripting, Python, or other automation tools Familiarity with Oracle GoldenGate or similar CDC (Change Data Capture) tools Familiarity with data governance, metadata management, and data lineage practices Ability to work with cross-functional teams including data scientists More ❯
of something bigger. MAIN PURPOSE OF THE ROLE & RESPONSIBILITIES IN THE BUSINESS: As a Senior Palantir Engineer (Senior Associate) at Kainos, you will be responsible for designing and developing data processing and data persistence software components for solutions which handle data at scale. Working in agile teams, Senior Data Engineers provide strong development leadership and take … responsibility for significant technical components of data systems. You will work within a multi-skilled agile team to design and develop large scale data processing software to meet user needs in demanding production environments. YOUR RESPONSIBILITIES WILL INCLUDE: • Working to develop data processing software primarily for deployment in Big Data technologies. The role encompasses the full software … Experience of data visualisation and complex data transformations• Experience with streaming and event-processing architectures including technologies such as Kafka and change-data-capture (CDC) products• Expertise in continuous improvement and sharing input on data best practice. Embracing our differences At Kainos, we believe in the power of diversity, equity and inclusion. We are More ❯
About the Role The Senior Data Engineer will play a key role in designing, implementing, and optimizing Mission's data infrastructure as part of our modern data platform initiative. This hands-on engineering role will focus on building scalable data pipelines, enabling a centralized enterprise data warehouse, and supporting business reporting needs. The ideal candidate … will collaborate across technology, operations, product, and analytics teams to create high-quality, governed, and reusable data assets, while supporting a long-term architecture aligned with Mission's growth. This role is highly technical and focused on execution and is ideal for a data engineer who thrives in fast-paced environments and is passionate about data quality … marts with a focus on dimensional modeling and analytics-ready schemas. Fluent in a range of data ingestion methods: RESTful APIs, JSON/XML ingestion, flat files, message queues, and CDC patterns. Understanding of data governance, metadata management, and access control. Familiarity with version control (GitLab) and CI/CD pipelines for data workflows. Experience working with BI tools More ❯
About the Role The Data Engineer will support the development and optimization of Mission's data infrastructure as part of our modern data platform initiative. This hands-on role will focus on implementing scalable data pipelines, enabling a centralized enterprise data warehouse, and supporting business reporting needs. The ideal candidate will collaborate across technology, operations … product, and analytics teams to deliver high-quality, governed, and reusable data assets in alignment with Mission's long-term architecture. What You'll Do Contribute to the implementation of scalable data pipelines to ingest, transform, and store data from third-party vendors and internal systems using APIs, files, and databases. Support the development of a cloud … based data warehouse solution in partnership with architects, ensuring clean, normalized, and performant data models. Establish and monitor reliable data ingestion processes across systems with varying grain and cadence, ensuring data quality and completeness. Collaborate with API and integration teams to develop secure, robust data exchange processes with external vendors and internal services. Set up More ❯
part of something bigger. JOB PROFILE DESCRIPTION As a Palantir Solution Architect (Manager) in Kainos, you'll be responsible for leading multi-skilled agile teams to design and deliver contemporary data solutions. You will be a quality orientated pragmatist, where you balance trade-offs to successfully deliver complex solutions. You will be viewed as an authority figure for data … providing strong technical and thought leadership. Your responsibilities will include: • Successfully implementing functional and non-functional designs• Working closely with Ops and Infrastructure architects to productionise robust, resilient, and maintainable data solutions• Working closely with customer architects to agree functional and non-functional designs. Advising, questioning, informing, and helping the customer reach sound solution design outcomes.• Working with your … and mentor those around you• Managing and estimating timelines underpinning solution delivery• Owning technical solution design as part of a pre-sales process• Making a significant contribution to the data community and wider data capability• Managing, coaching and developing a small number of staff, with a focus on managing employee performance and assisting in their career development. You More ❯
part of something bigger. JOB PROFILE DESCRIPTION As a Palantir Technical Architect (Consultant) in Kainos, you'll be responsible for designing and delivering technical components as part of a larger data solution. You will work closely with Solution Architects and Customer Architects to integrate these components into quality data solutions. Your responsibilities will include: • Successfully implementing functional and non-functional … are upheld• Managing and estimating timelines underpinning technical component delivery• Contributing to or owning technical solution design as part of a pre-sales process• Making a significant contribution to the data analysis community and wider data and analytics capability• Managing, coaching and developing a small number of staff, with a focus on managing employee performance and assisting in their … You'll also provide direction and leadership for your team as you solve challenging problems together. MINIMUM (ESSENTIAL) REQUIREMENTS: • Experience of technical ownership for a component, sub-system or product (data ingestion, data streaming), including architecture, estimation, product planning and story creation• Proficient in client interaction including communication of technical decisions to non-technical audiences• Experience of applying standards More ❯
San Diego, California, United States Hybrid/Remote Options
Appfolio
Description What we're looking for: The Site Reliability Engineer II - Data will contribute to our growing Data Engineering and Operations team. We work collaboratively to develop an infrastructure that ingests data from disparate sources and routes them to various target storage and applications, thus providing access to high-quality data to users, ranging from … application developers interested in specific events to data analysts keen on business intelligence to data scientists training ML models. At AppFolio, we paddle as one. We ride and make waves together, with a relentless focus on building great products for the way our customers work and live today - and tomorrow. AppFolio is a destination organization where careers are … made and accelerated. Here, innovation is a team sport. Your impact: Design, build, and operate next-generation data pipeline infrastructure Improve data architecture, quality, discoverability, and access policies to enable and enforce data governance Collaborate with engineers, data analysts, and scientists to ensure that our data infrastructure meets the SLOs of our data More ❯
access essential radiology services at fair prices and without surprise bills, all while delivering immediate savings and ROI for employers and payers on every exam. We are seeking a Data Engineer to join our dynamic team. The ideal candidate is an enthusiastic problem-solver who excels at building scalable data systems and has hands-on experience with Databricks … Looker, AWS, MongoDB, PostgreSQL, and Terraform. You will work alongside sales, customer success and engineering to design, implement, and maintain the operational data infrastructure that powers our analytics and platform offerings. What you'll do: Data Pipeline & Integration Design, build, and maintain end-to-end data pipelines using Databricks (SparkSQL, PySpark) for data ingestion, transformation, and … processing. Integrate data from various structured and unstructured sources, including medical imaging systems, EMRs, Change Data Capture from SQL databases, and external APIs. Analytics & Visualization Collaborate with the analytics team to create, optimize, and maintain dashboards in Looker. Implement best practices in data modeling and visualization for operational efficiency. Cloud Infrastructure Deploy and manage More ❯
Do you love the Bible and also enjoy writing code and processing data? This role may be for you! The YouVersion Senior Data Engineer role will be responsible for providing data engineering support and delivery across all product lanes. This position will require expertise in areas of data engineering, architecture, governance, quality, and testing. This individual … will need to understand the big picture of all things YouVersion data, yet be able to design and deliver at the detailed technical level. This role is critical in increasing engagement and growth through the Bible App globally. The Senior Data Engineer will work tightly with the Data Analytics team, Software Engineering team, and Product teams. This … tuning. Highly skilled in Python, Go, Java, or other general-purpose programming languages. Proficient with data orchestration/transformation tools such as Airflow, Pub/Sub, Fivetran, CDC, dbt, streaming, etc. Proficient in Database Management Systems, such as Postgres, BigQuery, or SQL Server. Understanding of user event data collection products such as Firebase, GA4, or similar. Preferred More ❯
Data Science Engineer – Azure | Databricks | AI Innovation Permanent | Birmingham (Hybrid) Amtis is proud to partner with an advanced, data-driven organisation — a business that’s not just talking about AI, but actively building intelligent systems powered by Azure, Databricks, and real-world machine learning applications. This is a hands-on engineering role where you’ll be at the … core of designing, developing, and optimising modern data platforms that enable predictive analytics, AI experimentation, and large-scale automation. You’ll work in an environment where data truly drives business decisions — not just dashboards. If you’re excited by high-volume, high-velocity data challenges and want to work on next-gen infrastructure that supports advanced analytics … Looking For Strong hands-on experience with Azure Databricks, Data Factory, Blob Storage, and Delta Lake Proficiency in Python, PySpark, and SQL Deep understanding of ETL/ELT, CDC, streaming data, and lakehouse architecture Proven ability to optimise data systems for performance, scalability, and cost-efficiency A proactive problem-solver with great communication skills and a passion More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Advanced Resource Managers
SQL Analyst, CDP & CRM Segmentation London based – hybrid working – 3 days on site 3-6 Month contract – Inside IR35 We’re standing up a Customer Data Platform (CDP) and need a hands-on SQL analyst to tighten audience segmentation, productionise segmentation logic in the warehouse, and push curated traits into Braze so CRM can operate campaigns without overloading the … data team. What you’ll do Own audience & trait definition Translate CRM/Marketing objectives into precise, reusable audience and trait specs Build and maintain warehouse-first segmentation tables/views with clear SLOs and documentation. Ship reliable data to Braze Design pipelines to push curated traits and audiences to Braze , including change-datacapture … Braze counts vs. warehouse truth to ensure high confidence. Unblock campaign operations Create a self-serve library of SQL snippets, views, so CRM can launch campaigns without ad-hoc data requests. Implement suppression logic (deliverability, compliance, frequency caps) and guardrails. Partner across teams Work closely with Data Engineering, CRM, BI and Product to align event/identity strategy More ❯
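The warehouse-first segmentation and suppression logic this role describes can be sketched as a reusable view: define the audience trait once, and suppress anyone failing compliance or deliverability rules in the same query. The sketch below uses Python's sqlite3 as a stand-in for the warehouse; all table and column names (customers, consents, audience_gold) are illustrative assumptions, not the client's actual schema.

```python
# Warehouse-first audience definition with suppression logic, sketched
# against sqlite3 as a stand-in warehouse. Table/column names are
# illustrative assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, ltv REAL);
CREATE TABLE consents  (customer_id INTEGER, email_opt_in INTEGER);
INSERT INTO customers VALUES (1,'a@x.com',500),(2,'b@x.com',50),(3,'c@x.com',900);
INSERT INTO consents  VALUES (1,1),(2,1),(3,0);
-- Audience: high-LTV customers, suppressing anyone without email opt-in.
CREATE VIEW audience_gold AS
SELECT c.id, c.email
FROM customers c
JOIN consents k ON k.customer_id = c.id
WHERE c.ltv >= 100 AND k.email_opt_in = 1;
""")
rows = con.execute("SELECT id, email FROM audience_gold ORDER BY id").fetchall()
print(rows)  # customer 2 fails the LTV trait; customer 3 is suppressed
```

Because the audience lives in a view, CRM can sync the same definition to Braze repeatedly without ad-hoc requests to the data team, and count reconciliation reduces to comparing the view's row count against the platform's segment size.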
Bellevue, Washington, United States Hybrid/Remote Options
Brook
long-term health goals by supporting smart, daily decisions and partnering with their primary care physicians. Our product suite includes continuous remote monitoring, population health management tools, and a CDC-approved diabetes prevention program. Brook has an intentional, user-centric culture with high expectations for delivering better health outcomes for patients, providers, and health systems. Job Overview The Data … s analytics capabilities by transforming complex datasets into actionable insights that drive business performance and patient outcomes. This role develops, maintains, and scales reporting and visualization tools that enable data-driven decision-making across Brook's business and clinical operations. You will collaborate closely with Product, Operations, Finance, and Partner Success teams to identify trends, measure impact, and provide … in-office collaboration. Requirements Key Responsibilities Design and maintain analytics dashboards: Build, automate, and enhance reports and dashboards that track key business, clinical, and population health metrics to support data-driven decision-making. Analyze business and operational performance: Use quantitative methods to identify trends, measure outcomes, and translate findings into actionable recommendations that drive growth and efficiency. Develop standardized More ❯
Herndon, Virginia, United States Hybrid/Remote Options
ALTA IT Services
Trust clearance (level 2) The DMAC AMD team is seeking an Application Architect to lead the design and implementation of secure, scalable, and modern cloud solutions supporting Public Health data modernization initiatives. In this role, you'll architect, optimize, and guide the delivery of cloud-based platforms that enable more effective data sharing, analytics, and decision-making across … solving complex engineering challenges that directly advance national Public Health outcomes. Key Responsibilities: • Design, implement, and maintain secure, scalable user applications hosted in the cloud to enable Public Health data modernization • Architect and deploy enterprise applications and services in AWS and hybrid environments • Guide development of applications using .NET, Entity Framework, and SQL across cloud and on-premise systems … Trust or Suitability/Fitness determination based on client requirements • HS diploma or GED Nice If You Have: • Experience optimizing cloud cost management and implementing governance frameworks • Familiarity with CDC processes, data modernization efforts, and federal IT environments • Master's degree in Computer Science or related field • Cloud certifications such as AWS Solution Architect (Associate/Professional) or Microsoft More ❯
tuning techniques Experience with backup and disaster recovery processes Review the current Debezium deployment architecture, including Oracle connector configuration, Kafka integration, and downstream consumers. Analyze Oracle database setup for CDC compatibility (e.g., redo log configuration, supplemental logging, privileges). Evaluate connector performance, lag, and error handling mechanisms. Identify bottlenecks, misconfigurations, or anti-patterns in the current implementation. Provide a detailed … expertise in MongoDB architecture, including replication, sharding, backup, and recovery Strong hands-on experience with Debezium, especially the Oracle connector (LogMiner). Deep understanding of Oracle internals relevant to CDC: redo logs, SCNs, archive log mode, supplemental logging. Proficiency with Apache Kafka and Kafka ecosystem tools. Experience with monitoring and debugging Debezium connectors in production environments. Ability to analyze logs … to comply with CHAMP Security Requirements (including but not limited to CHAMP's IT Security Policies, especially the ISMS Policy and the Acceptable Use Policy, mandatory courses, confidentiality and data protection, use of company assets, and incident reporting). CHAMP Cargosystems is an equal opportunity employer and prohibits discrimination and harassment of any kind. We are committed to the More ❯
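A Debezium health check of the kind described above often starts by inspecting captured change events: counting operations per table and checking that Oracle SCNs arrive in order. The field names below (op, source.table, source.scn) follow Debezium's documented event envelope, but treat this as a review sketch under those assumptions, not an official Debezium API.

```python
# Triage helper for reviewing Debezium-style Oracle change events during a
# CDC health check: count operations per table and flag out-of-order SCNs.
# Field names (op, source.table, source.scn) mirror Debezium's documented
# envelope; this is an illustrative sketch, not part of Debezium itself.
from collections import Counter

def summarize_events(events):
    ops = Counter()
    last_scn = {}
    out_of_order = []
    for ev in events:
        table = ev["source"]["table"]
        scn = int(ev["source"]["scn"])
        ops[(table, ev["op"])] += 1
        if scn < last_scn.get(table, 0):   # SCNs should be non-decreasing per table
            out_of_order.append((table, scn))
        last_scn[table] = max(last_scn.get(table, 0), scn)
    return ops, out_of_order

events = [
    {"op": "c", "source": {"table": "ORDERS", "scn": "101"}},
    {"op": "u", "source": {"table": "ORDERS", "scn": "105"}},
    {"op": "d", "source": {"table": "ORDERS", "scn": "103"}},  # arrived late
]
ops, bad = summarize_events(events)
print(ops, bad)
```

Flagged gaps like the late delete above point the review toward the connector's redo-log position handling, Kafka partitioning, or consumer-side reordering.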
City of London, London, United Kingdom Hybrid/Remote Options
Intelix.AI
firm operating at the intersection of knowledge graphs, generative AI, and enterprise transformation. This role requires you to embed graph intelligence into mission-critical systems enabling explainable AI, unified data views, and advanced reasoning across regulated industries and industrial domains. You will drive the design, build, and deployment of knowledge graph + GenAI systems for high-impact clients. You … ll be part of a small elite team, bridging data, AI, and business outcomes — from model scoping through to production launch. Ship full systems Operate in regulated or industrial domains (e.g. manufacturing, life sciences, public sector) Embed client strategy and technical teams Stretching the frontier … hybrid KG + AI, graph + reasoning + M 🛠 Responsibilities Lead KG schema & ontology design across domains (assets, risk, supply chain, compliance) Build ingestion pipelines (ETL/streaming/CDC) and entity resolution for graph population Author complex queries (Cypher, GSQL, AQL, SPARQL etc. depending on stack) Integrate knowledge graph retrieval & reasoning into LLM/RAG/GraphRAG systems Develop More ❯
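The entity-resolution step in graph population merges source records that refer to the same real-world entity before emitting nodes. A deliberately naive sketch follows: it blocks on a normalized name (lowercased, punctuation and corporate suffixes stripped), which is an assumption for illustration only; production systems layer richer blocking keys and similarity scoring on top.

```python
# Minimal entity-resolution pass for knowledge-graph population: group
# source records by a normalized name key so each group becomes one node.
# The normalization rule is a deliberately naive assumption.
import re

SUFFIXES = {"ltd", "plc", "inc", "gmbh"}

def normalize(name):
    """Lowercase, strip punctuation, and drop common corporate suffixes."""
    tokens = re.sub(r"[^\w\s]", "", name.lower()).split()
    return " ".join(t for t in tokens if t not in SUFFIXES)

def resolve(records):
    """Group records by normalized name; each group becomes one graph node."""
    nodes = {}
    for rec in records:
        key = normalize(rec["name"])
        nodes.setdefault(key, {"aliases": set()})["aliases"].add(rec["name"])
    return nodes

nodes = resolve([
    {"name": "Acme Ltd."},
    {"name": "ACME"},
    {"name": "Globex Inc"},
])
print(sorted(nodes))  # two resolved entities from three source records
```

The resolved groups then map onto graph writes (one node per key, one alias edge or property per source record), whatever the target store's query language, Cypher, GSQL, or SPARQL, happens to be.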
requirements Participate with other coordinators across the country in a joint advancing public health readiness challenge Define and manage the process for systems to test the submission of meaningful use data received directly from providers and help to accept the data when successful Serve as the liaison between the state and facilities for issues regarding meaningful use and immunization … data Strong working knowledge and experience of … Oracle database and PL/SQL scripting Serve as the central point of contact for information and assistance regarding meaningful use for the Centers for Disease Control and Prevention (CDC), the Joint Public Health Informatics Taskforce (JPHIT), the Centers for Medicare and Medicaid Services (CMS), and the Office of the National Coordinator for Health Information Technology (ONC) Participate in projects and More ❯