powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. FS Technology Consulting - AI and Data - Data Engineer - Senior Consultant/Manager - Dublin/Cork/Limerick/Galway General Information Location: Dublin, Cork, Limerick, or Galway. Available for VISA Sponsorship: No Business Area: Data & Analytics Contract Type: Full Time - Permanent EY is the only major professional services firm with an integrated Financial Services practice across Europe, the Middle East, India, and Africa (EMEIA). We connect our Asset Management, Banking and Capital Markets, and Insurance clients to 6,500 talented people from 12 countries and 35,000 Financial Services colleagues around the … Azure Functions and Logic Apps for automation. Snowflake: Strong SQL skills and experience with Snowflake's architecture (virtual warehouses, storage, cloud services). Proficiency in Snowflake Streams & Tasks for CDC and automation. Experience with Snowflake Secure Data Sharing and Snowflake Marketplace. Familiarity with Snowpark for Python/Java-based transformations. Understanding of role-based access control and data masking …
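The Streams & Tasks requirement above refers to Snowflake's change-tracking objects: a stream exposes the row-level inserts, updates, and deletes made to a table since it was last consumed, and a scheduled task applies them downstream. As a rough illustration of that CDC pattern (plain Python, not Snowflake SQL; all names are invented for the sketch):

```python
# Minimal simulation of the Snowflake Streams & Tasks CDC pattern:
# a "stream" records row-level changes on a source table since its
# last consumption offset, and a "task" applies them to a target.

class Stream:
    """Tracks changes to a source table (dict of pk -> row)."""
    def __init__(self, source):
        self.source = source            # live table
        self._snapshot = dict(source)   # offset = last-consumed state

    def changes(self):
        """Return (inserts, updates, deletes) since the stored offset."""
        inserts = {k: v for k, v in self.source.items()
                   if k not in self._snapshot}
        updates = {k: v for k, v in self.source.items()
                   if k in self._snapshot and self._snapshot[k] != v}
        deletes = [k for k in self._snapshot if k not in self.source]
        return inserts, updates, deletes

    def consume(self):
        """Advance the offset, like a DML statement reading the stream."""
        self._snapshot = dict(self.source)

def apply_changes(target, stream):
    """The 'task': merge all pending changes into the target table."""
    inserts, updates, deletes = stream.changes()
    target.update(inserts)
    target.update(updates)
    for k in deletes:
        target.pop(k, None)
    stream.consume()

source = {1: "alice", 2: "bob"}
target = dict(source)
stream = Stream(source)

source[3] = "carol"   # insert
source[2] = "bobby"   # update
del source[1]         # delete

apply_changes(target, stream)
print(target)         # {2: 'bobby', 3: 'carol'}
```

Because the stream's offset only advances when it is consumed, a task that fails before `consume()` sees the same changes again on retry, which is the property that makes stream-driven pipelines restartable.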
Reston, Virginia, United States Hybrid / WFH Options
ICF
supporting 100% remote work anywhere within the United States. Must be able to support Eastern Time Zone. ICF is a rapidly growing, entrepreneurial, multi-faceted consulting company seeking a Data Architect. The Data Architect will lead the design and implementation of a robust, scalable, and secure data lake architecture to support advanced analytics and AI/ML … also expected to uphold and maintain appropriate certifications necessary for their practice expertise. What you'll be doing: Lead the architecture and design of an enterprise-scale AWS-based data lake and data integration ecosystem to support advanced analytics and AI/ML initiatives. Define and enforce data modeling, metadata, data lineage, and data governance standards to support analytics and machine learning workflows. Establish best practices in data architecture, including schema design, data normalization, and optimal format selection (e.g., Parquet, JSON). Work with cross-functional teams to define data ingestion, transformation, and curation strategies aligned with AI/ML use cases. Collaborate across teams to design scalable, secure, and cost-efficient data …
Reston, Virginia, United States Hybrid / WFH Options
ICF
supporting 100% remote work anywhere within the United States. Must be able to support Eastern Time Zone. ICF is a rapidly growing, entrepreneurial, multi-faceted consulting company seeking a Data Engineer. The Data Engineer will help bring new data insights to a government agency committed to improving child welfare. The ICF team performs custom software development, analytics … oral presentations. Team members are also expected to uphold and maintain appropriate certifications necessary for their practice expertise. What you'll be doing: Help build and optimize an AWS-based data lake to support AI/ML initiatives and advanced analytics. Design and implement scalable data ingestion pipelines for both batch and real-time data from diverse structured and unstructured sources. Perform extensive data profiling, transformation, and enrichment to prepare clean, ML-ready datasets for data scientists and analysts. Develop custom reports and data visualizations to support analytics and decision-making across business and technical teams. Collaborate with data scientists and business teams to deliver curated datasets and reporting needs for ML and analytics. Support …
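Data profiling of the kind this role describes — checking null rates and distinct counts before shipping ML-ready datasets — can be sketched with nothing but the standard library. The column names, sample rows, and cleaning rules below are invented for illustration:

```python
# Hypothetical sketch of profiling and cleanup before handing rows
# to data scientists: measure null rate and cardinality, then drop
# incomplete records and cast string fields to their real types.
import statistics

rows = [
    {"id": "1", "age": "34", "state": "VA"},
    {"id": "2", "age": "",   "state": "VA"},   # missing age
    {"id": "3", "age": "29", "state": "MD"},
]

def profile(rows, column):
    """Basic column-level profile: null rate and distinct count."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v != ""]
    return {
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
    }

# Transformation/enrichment step: drop nulls, cast age to int.
clean = [{**r, "age": int(r["age"])} for r in rows if r["age"] != ""]

print(round(profile(rows, "age")["null_rate"], 2))   # 0.33
print(statistics.mean(r["age"] for r in clean))      # 31.5
```

In a real pipeline the same checks would run over columnar files in the lake (e.g. via Spark or AWS Glue) and gate whether a dataset is published, but the profile/transform split is the same.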
Hoplite Solutions is seeking a highly experienced and motivated Data Engineer to join our big data platform team, with a strong focus on Oracle technologies. The ideal candidate will play a critical role in designing, building, and optimizing data pipelines and architecture that support our enterprise-scale data initiatives in an IC customer space. This role demands technical expertise, strategic thinking, and hands-on experience with Oracle-based systems in high-volume environments. If you thrive in a fast-paced, data-driven organization and enjoy solving complex data engineering challenges, we want to hear from you. Required Qualifications: 6+ years of experience in data engineering, ETL development, or database management. Strong expertise … schemas. Knowledge of ETL pipeline design and implementation for large-scale data systems. Experience in shell scripting, Python, or other automation tools. Familiarity with Oracle GoldenGate or similar CDC (Change Data Capture) tools. Familiarity with data governance, metadata management, and data lineage practices. Ability to work with cross-functional teams including data scientists …
Location: Bishopsgate, London We are seeking a highly experienced Senior Developer with deep hands-on expertise in AWS BluAge, AWS cloud services and ChangeDataCaptureCDC mechanisms. The ideal candidate will be responsible for leading modernization efforts, building cloud native applications, and implementing real time data synchronization strategies using CDC. Must Have Technical Skills: -AWS … BluAge -AWS cloud services -ChangeDataCapture Nice to have skills: Good to have some knowledge on banking domain Top 3 responsibilities: -Solution design -To Involve in all SDLC -Excellent communication skills and a team player. Liaise with Clients, SMEs and Business Associates More ❯
landscape, detailing how Salesforce integrates with internal/external systems, ensuring scalability and maintainability. Define robust integration strategies, including REST/SOAP APIs, middleware (e.g., MuleSoft), platform events, and change data capture, while adhering to Salesforce integration best practices. Create and maintain secure, scalable data models, applying Salesforce data modeling principles - object relationships, custom metadata … Salesforce-related implementations. Maintain the right balance between low-code and pro-code solutions. Stay current with Salesforce releases and roadmap, continuously improving architectural standards and leveraging platform innovations (e.g., Data Cloud, Einstein AI, CRM Analytics). Key Role Requirements: Proficiency in Apex Programming & Triggers - deep understanding of Apex classes, triggers, asynchronous processing (Batch, Queueable, Future), and test-driven development. … Advanced Experience with Lightning Web Components (LWC) & Aura - ability to design dynamic, high-performance UIs using modern Salesforce front-end frameworks. Strong Knowledge of Salesforce Data Modeling - expertise in creating and optimizing custom objects, relationships, schema design, indexing, and large data volume (LDV) strategies. Integration Architecture & Patterns - hands-on experience with REST/SOAP APIs, Platform Events, and Change Data Capture …
Reston, Virginia, United States Hybrid / WFH Options
ICF
Description: Our Digital Modernization and Experience (DMX) Group is growing, and we are looking for a motivated, experienced Senior Databricks SME who is passionate about turning complex data into actionable solutions that improve public systems and services. This role supports an enterprise initiative focused on platform infrastructure and analytics modernization for a federal customer. You'll be joining a cross-functional team of full-stack developers, data engineers, and data analysts working within a modular, cloud-native platform supporting the emergency management sector. Your work will help ensure disaster management and mitigation decision-makers have access to accurate, timely, and meaningful data and data products to drive effective service delivery and measurable mission outcomes. If … Support the design and development of data pipelines and ETL routines in an Azure cloud environment for many source system types, including RDBMS, API, and unstructured data, using CDC, incremental, and batch loading techniques. Conduct data profiling, transformation, and quality assurance on structured, semi-structured, and unstructured data. Identify underlying issues and translate them into technical requirements. Assist …
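Of the loading techniques this posting lists, watermark-based incremental loading is the simplest to illustrate: persist the highest modification timestamp seen so far and pull only newer rows on each run. A minimal sketch (the table shape and `updated_at` column are hypothetical):

```python
# Watermark-based incremental loading: each run extracts only rows
# modified after the last recorded high-water mark, then advances it.
from datetime import datetime

source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
    {"id": 3, "updated_at": datetime(2024, 1, 9)},
]

def incremental_load(source, watermark):
    """Return (rows newer than watermark, new watermark)."""
    batch = [r for r in source if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in batch),
                        default=watermark)
    return batch, new_watermark

# First run picks up everything after Jan 3.
batch, wm = incremental_load(source, datetime(2024, 1, 3))
print([r["id"] for r in batch])   # [2, 3]

# A second run with the advanced watermark sees nothing new.
batch2, _ = incremental_load(source, wm)
print(batch2)                     # []
```

True CDC (log-based change capture) goes further by also surfacing deletes and intermediate updates, which a timestamp filter alone cannot see; that is typically why pipelines combine both techniques.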
Title: Ab Initio Developer. Location: Reston, VA. Duration: 12 Months. Job Description: Hybrid role. Client is looking for a senior Ab Initio developer with many years of data integration and data warehouse project experience. NoSQL/MongoDB and AWS experience is highly preferred; must be a self-starter able to work with minimal supervision. 10+ years of IT experience, predominantly in the Data Integration/Data Warehouse area. Must have at least 5 years of ETL design and development experience using Ab Initio. 1-2 years of data integration project experience on the Hadoop platform, preferably Cloudera. Ab Initio CDC (Change Data Capture) experience in a data integration/ETL project setting is great to have. Working knowledge of HDFS, Hive, Impala, and other related Hadoop …
/SQL, etc.) • Experience with SSIS • Experience with large enterprise-grade data warehouse solutions • Extensive experience working with Informatica • Experience working with Change Data Capture (CDC) technology • Experience with large database implementations (Oracle, SQL Server, Teradata, IBM, etc.) • Experience with Big Data solutions, including Hadoop • Experience with Predictive Analytics and Forecasting • Strong analytical skills and …
SQL Analyst, CDP & CRM Segmentation. London based - hybrid working - 3 days on site. 3-6 month contract - Inside IR35. Why this role exists: We're standing up a Customer Data Platform (CDP) and need a hands-on SQL analyst to tighten audience segmentation, productionise segmentation logic in the warehouse, and push curated traits into Braze so CRM can operate … campaigns without overloading the data team. What you'll do: Own audience & trait definition - translate CRM/Marketing objectives into precise, reusable audience and trait specs; build and maintain warehouse-first segmentation tables/views with clear SLOs and documentation. Ship reliable data to Braze - design pipelines to push curated traits and audiences to Braze, including change data capture and dependency handling; set up monitoring/alerts and reconcile Braze counts vs. warehouse truth to ensure high confidence. Unblock campaign operations - create a self-serve library of SQL snippets and views so CRM can launch campaigns without ad-hoc data requests; implement suppression logic (deliverability, compliance, frequency caps) and guardrails. Partner across teams - work …
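A warehouse-first segmentation view with suppression logic, as described above, might look like the following sqlite3 sketch. The schema, the LTV threshold, and the 7-day frequency cap are all invented for illustration; a real CDP warehouse and its Braze sync would differ:

```python
# Hypothetical audience view: high-value customers, with opt-outs
# and recently-contacted customers suppressed directly in SQL, so
# CRM tooling reads one curated object instead of ad-hoc queries.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER, email TEXT, ltv REAL, opted_out INTEGER);
CREATE TABLE sends (customer_id INTEGER, sent_at TEXT);

INSERT INTO customers VALUES
  (1, 'a@example.com', 120.0, 0),
  (2, 'b@example.com', 300.0, 1),          -- suppressed: opted out
  (3, 'c@example.com', 250.0, 0);
INSERT INTO sends VALUES (3, date('now')); -- suppressed: frequency cap

-- Reusable audience definition with suppression built in.
CREATE VIEW audience_high_value AS
SELECT c.id, c.email
FROM customers c
WHERE c.ltv >= 100
  AND c.opted_out = 0
  AND c.id NOT IN (
      SELECT customer_id FROM sends
      WHERE sent_at >= date('now', '-7 days'));
""")

print(con.execute("SELECT id FROM audience_high_value").fetchall())  # [(1,)]
```

Keeping suppression inside the view means every downstream consumer (Braze push, counts reconciliation, self-serve snippets) inherits the same guardrails automatically.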
Arlington, Virginia, United States Hybrid / WFH Options
Alpha Omega Integration LLC
Our Health IT capabilities enhance health outcomes for the warfighter via the Defense Health Agency (DHA) and support public health initiatives at the Centers for Disease Control and Prevention (CDC), National Institutes of Health (NIH), and the Substance Abuse and Mental Health Services Administration (SAMHSA). Space & Science - We leverage technology to advance agricultural sustainability, secure our nation's food …
Huntsville, Alabama, United States Hybrid / WFH Options
ICF
to support daily operations of IT systems for a government program. You will manage and optimize production databases supporting the DoD MCC JAS program, ensuring high availability, performance, and data integrity. Job Location: Remote role - must be performed within the United States and support U.S. Eastern Time Zone hours. ICF monitors employee work locations, restricts access from foreign IP addresses, and prohibits the use of personal VPNs. Responsibilities: Monitor database performance and implement tuning strategies. Ensure data integrity, backups, and disaster recovery readiness. Collaborate with development teams on schema design, query optimization, and data modeling (OLTP/OLAP). Support database security and access controls in compliance with DoD standards. Maintain documentation and change control procedures. … databases. Scripting ability in at least one of PowerShell, Bash, or Python. Experience hardening and accrediting systems under DoD RMF. Familiarity with data integration & messaging (ETL/ELT tools, CDC, Kafka) and supporting analytics/BI workloads. Experience with Infrastructure as Code and config management (Terraform, Ansible) for DB infrastructure and parameter baselines. Experience building database observability: custom metrics, log …
Arlington, Virginia, United States Hybrid / WFH Options
Alpha Omega Integration LLC
Technologies: .NET MVC - custom attributes and references to MVC controllers, views, actions, and templates. WCF (Windows Communication Foundation), Web Services, SOAP endpoints, and service contracts for inter-service communication. Data Access: Entity Framework 6.1.1 - ORM and database access. .NET Dependency Injection/IoC: Autofac, Unity, Castle.Core, CommonServiceLocator. Logging & Exception Handling: Enterprise Library Logging, ExceptionHandling, and related configuration; RollingFlatFileTraceListener (Enterprise …
Jacksonville, Florida, United States Hybrid / WFH Options
Collabera
DB2 z/OS DBA to manage the full lifecycle of mainframe DB2 databases, including design, implementation, maintenance, and decommissioning. The role involves supporting DB2 replication using IIDR/CDC, performing database upgrades, and ensuring high availability, performance, and security of production systems. Key Responsibilities: Manage DB2 databases on z/OS, including replication, backup, and recovery. Plan subscriptions and … documentation and reports for database operations and health. Required Skills & Experience: 5+ years of DB2 DBA experience on z/OS; 1+ year with data replication (IIDR/CDC). Strong knowledge of InfoSphere CDC, DB2 Utilities (BMC), VSAM, IDCAMS, and Endevor. Experience with BMC tools: Change Manager, Catalog Manager, MainView, SQL Performance. Broad understanding of ACF2/…
required. You will join a global IT consultancy delivering digital transformation to a public sector body. Key Duties and Responsibilities: Develop and optimise PySpark batch pipelines that process Parquet data and use Delta Lake for all IO, applying validation, enrichment, and billing calculation logic directly in PySpark code. Build reliable PySpark jobs that read/write Delta tables on … adjustments, and full auditability. Externalise billing and validation rules via versioned JSON configs, ensuring deterministic, idempotent re-runs. Optimise Delta operations (MERGE, OPTIMIZE, Z-ORDER, VACUUM) and incremental/CDC merges into Azure SQL. Tune performance (partitioning, caching, broadcast joins) and maintain robust retries, checkpoints, and structured logging. Integrate with orchestrators (ADF or Container App Orchestrator) and CI/CD …
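The idempotent CDC merge this role asks for boils down to MERGE-style upsert/delete semantics keyed on a primary key: re-applying the same change batch must yield the same target state. A plain-Python sketch of that invariant (not actual Delta Lake or PySpark code; the keys, fields, and event shape are invented):

```python
# Plain-Python sketch of Delta-MERGE-like semantics: apply a CDC
# batch of insert/update/delete events keyed by primary key, such
# that a re-run of the same batch is a no-op (idempotent).

def merge(target, cdc_batch):
    """Apply CDC events to target (dict of pk -> row), in order."""
    for event in cdc_batch:
        if event["op"] == "delete":
            target.pop(event["id"], None)   # delete is idempotent
        else:                               # insert/update -> upsert
            target[event["id"]] = event["data"]
    return target

target = {1: {"amount": 10}}
batch = [
    {"op": "update", "id": 1, "data": {"amount": 15}},
    {"op": "insert", "id": 2, "data": {"amount": 7}},
    {"op": "delete", "id": 3},
]

once = merge(dict(target), batch)
twice = merge(merge(dict(target), batch), batch)  # deterministic re-run
print(once == twice)   # True
print(once)            # {1: {'amount': 15}, 2: {'amount': 7}}
```

In Delta Lake the same guarantee comes from `MERGE INTO ... WHEN MATCHED / WHEN NOT MATCHED` clauses running inside an ACID transaction, which is why the posting pairs MERGE with deterministic, re-runnable billing jobs.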
role is fully remote and is inside IR35. You will require active CTC Clearance. Key Duties and Responsibilities: Develop and optimise PySpark batch pipelines that process Parquet data and use Delta Lake for all IO, applying validation, enrichment, and billing calculation logic directly in PySpark code. Build reliable PySpark jobs that read/write Delta tables … full auditability. Externalise billing and validation rules via versioned JSON configs, ensuring deterministic, idempotent re-runs. Optimise Delta operations (MERGE, OPTIMIZE, Z-ORDER, VACUUM) and incremental/CDC merges into Azure SQL. Tune performance (partitioning, caching, broadcast joins) and maintain robust retries, checkpoints, and structured logging. Integrate with orchestrators (ADF or Container App Orchestrator) and CI/CD …