Fort Lauderdale, Florida, United States Hybrid/Remote Options
Vegatron Systems
CORP-TO-CORP WITH 3RD PARTY WORK AUTHORIZATION: US Citizen or GC, GC-EAD, H4 & L2 EAD; NO H1B or OPT/CPT ACCEPTED BY CLIENT. RATE: $OPEN - DOE. JOB TITLE: Data Engineer - Big Data/Hadoop. JOB DESCRIPTION: Candidates will start out with remote work and will eventually be sitting in Fort Lauderdale, FL. Candidates should be senior Data Engineers with big data tools (Hadoop, Spark, Kafka) as well as AWS cloud services (EC2, EMR, RDS, Redshift) and NoSQL. This is a phone and Skype to hire. Candidates in Florida with a LinkedIn profile preferred but not required. Essential Duties and Responsibilities: • Past experience in executing and delivering solutions in an Agile scrum development environment is preferred • Produce deliverables with little oversight or management assistance • Assemble large, complex data sets that meet functional/non-functional business requirements • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide …
BIG DATA ENGINEER - DV CLEARED. NEW PERMANENT JOB OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A BIG DATA ENGINEER WITH DV CLEARANCE. Permanent job opportunity for a Big Data Engineer. Leading National Security & Defence SME. Salary up to £80,000 plus clearance bonus. London-based organisation in an easily accessible location. To apply please call or email. WHO ARE WE? We are recruiting multiple Big Data Engineers to support urgent National Security & Defence projects in London. Due to the nature of these projects you must hold DV or enhanced DV Security Clearance. WHAT WILL THE BIG DATA ENGINEER BE DOING? You will be joining a leading SME that is working hard to support National Security projects within UK Govt. departments in London. As part of a team, you will be responsible for implementing Big Data solutions in mission-critical areas. WE NEED THE BIG DATA ENGINEER TO HAVE: Current DV clearance - Standard or Enhanced. Must have experience with big data tools such …
ML Data Engineer - Healthcare Data Curation & Cleaning (1 Year Fixed Term). School of Medicine, Stanford, California, United States. Information Analytics. Post Date: Jun 03, 2025. Requisition: 106579. Stanford University is seeking a Big Data Architect 1 for a 1-year fixed term (with possibility of renewal) to design and develop applications, test and build automation tools, and support the development of Big Data architecture and analytical solutions. About Us: The Department of Biomedical Data Science merges the disciplines of biomedical informatics, biostatistics, computer science and advances in AI. The intersection of these disciplines is applied to precision health, leveraging data across the entire medical spectrum, including molecular, tissue, medical imaging, EHR, biosensory and population data. About the Position: We are seeking an experienced ML Data Engineer to drive the programmatic curation, cleaning, and generation of healthcare data. In this role, you will focus exclusively on developing and maintaining automated, ML-accelerated pipelines that ensure high-quality data ready for machine learning applications. Your work will be pivotal in shaping the integrity of …
CONTRACT DATA ENGINEER - eDV CLEARED. NEW OUTSIDE IR35 CONTRACT OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A DATA ENGINEER WITH eDV CLEARANCE. Contract job opportunity for a Data Engineer. National Security client. Palantir Foundry. Outside IR35. Central London-based organisation in an easily accessible location. To apply please call or email. WHO ARE WE? We are recruiting a contract Data Engineer to work with a National Security SME in central London. Due to the nature of the work, you must hold enhanced DV Clearance. WE NEED THE DATA ENGINEER TO HAVE: Current enhanced DV Security Clearance. Experience with big data tools such as Hadoop, Cloudera or Elasticsearch. Experience with Palantir … Scrum environment with tools such as Confluence/Jira. Experience in design, development, test and integration of software. Willingness to learn new technologies. IT WOULD BE NICE FOR THE DATA ENGINEER TO HAVE: Cloud-based architectures. Microservice or serverless architecture. Messaging/routing technologies such as Apache NiFi/RabbitMQ. TO BE CONSIDERED: Please either apply by …
CONTRACT DATA ENGINEER - eDV CLEARED. NEW CONTRACT OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY COMPANY FOR A DATA ENGINEER WITH eDV CLEARANCE. Contract job opportunity for a Principal Data Engineer. National Security client. Daily rate up to £900. Central London-based organisation in an easily accessible location. To apply please call or email (see below). WHO ARE WE? We are recruiting a contract Data Engineer to work with a National Security SME in central London. Due to the nature of the work, you must hold enhanced DV Clearance. WE NEED THE DATA ENGINEER TO HAVE: Current enhanced DV Security Clearance. Experience with big data tools such as Hadoop, Cloudera or Elasticsearch. Experience with … Scrum environment with tools such as Confluence/Jira. Experience in design, development, test and integration of software. Willingness to learn new technologies. IT WOULD BE NICE FOR THE DATA ENGINEER TO HAVE: Cloud-based architectures. Microservice or serverless architecture. Messaging/routing technologies such as Apache NiFi/RabbitMQ. TO BE CONSIDERED: Please either apply by …
DATA ARCHITECT - DV CLEARED. NEW PERMANENT JOB OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A DATA ARCHITECT WITH ENHANCED DV CLEARANCE. Permanent job opportunity for a Data Architect. Leading National Security SME. Salary up to £100,000 + Bonus. London-based organisation in an easily accessible location. To apply please call or email. WHO ARE WE? We are recruiting a Data Architect to support urgent National Security & Defence projects in London. Due to the nature of these projects you must hold enhanced DV Security Clearance. WHAT WILL THE DATA ARCHITECT BE DOING? You will be joining a leading SME that is working hard to support National Security projects within UK Govt. departments in London. As part of a team, you will be responsible for designing and implementing data solutions in mission-critical areas. WE NEED THE DATA ARCHITECT TO HAVE: Current DV clearance - Enhanced. Good at understanding complexity and abstracting it into a form that is consumable for a non-technical audience. Experience of achieving data interoperability between …
An opportunity to join us as we continue to expand throughout the UK. Main duties of the job: The role holder will be accountable for designing, building, and maintaining the data infrastructure and pipelines that enable efficient and accurate data collection, storage, processing, and analysis. The role holder will collaborate with cross-functional teams, including analysts, software engineers and key stakeholders, to create and maintain data solutions that are crucial in delivering high-quality care for our patients. About us: We are always looking for great talent to join our team and help achieve our ambitious goals and growth. We care about our people, and we care about the future of community health and how CHEC can play an innovative part in making this great, with your help. Job description and responsibilities: Design, implement, and optimise data storage systems, including databases, data warehouses, and data lakes, to efficiently handle large volumes of structured and unstructured data. Develop and maintain scalable ETL pipelines to extract, transform, and load data from various sources into the …
We are currently recruiting a Microsoft Fabric Data Engineer for a 6-month on-site contract. This is a hybrid position. The position offers a competitive pay rate. 8+ years of experience required. Data Management and Storage: Design and implement data storage systems using Azure services like Azure SQL Database, Azure Data Lake Storage, and Azure Synapse. Ensure scalability, performance, and cost-effectiveness. Data Integration and ETL (Extract, Transform, Load): Develop and implement data integration processes using Azure Data Factory. Extract data from various sources, transform it, and load it into data warehouses or data lakes. Big Data and Analytics: Utilize big data technologies such as Apache Spark. Create data processing workflows and pipelines to support data analytics and machine learning applications. Build and maintain new and existing applications in preparation for a large-scale architectural migration within an Agile function. Monitor and optimize data pipelines and database performance to ensure data processing efficiency. Build interfaces for supporting evolving and new applications and accommodating …
Job Title: Senior Data Engineer. Location: Cary, NC. Experience: 12+ years. About the Role: We are looking for an experienced Senior Data Engineer to lead the design and development of modern data platforms and scalable data pipelines. The ideal candidate has strong hands-on expertise in cloud data engineering, big data technologies, ELT/ETL architecture, and data modeling, along with the ability to mentor teams and work closely with business stakeholders to deliver high-quality data solutions. Key Responsibilities: Architect, design, and implement large-scale data pipelines and data integration workflows across structured and unstructured datasets. Build, optimize, and maintain robust ETL/ELT processes for ingestion, transformation, and delivery of data across enterprise systems. Develop and manage data lakes, data warehouses, and analytics platforms using cloud technologies. Work closely with data scientists, analysts, and business teams to understand data requirements and deliver reliable data for reporting and analytics. Define best practices for data quality, lineage, governance, security, and performance …
Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset. Are you ready? About your team: Join our growing Data & Analytics practice and make a difference. In this practice you will be utilizing the most innovative technological solutions in the modern data ecosystem. In this role you'll be able to see your own ideas transform into breakthrough results in the areas of Data & Analytics strategy, Management & Governance, Data Integration & Engineering, and Analytics & Data Science. About your role: The ideal candidate will have extensive experience in designing and implementing data architectures, with a strong understanding of database management, data modelling, and data governance. This … with strong analytical and problem-solving skills and the ability to work collaboratively with clients and cross-functional teams. Requirements: Experience in gathering, validating, synthesizing, documenting, and communicating data and information for a range of audiences, particularly audiences that are not technical. Design and implement robust, scalable, secure, optimised data solutions that support business requirements and strategic …
Job Title: Cloud Data Architect Lead Specialist Engineer. Location: Little Rock, Arkansas. Experience: 12+ years. Employment Type: Contract. Interview Type: In-Person or Webcam. Job Description: The Cloud Data Architect Lead Specialist Engineer will be responsible for designing, implementing, and optimizing cloud-based data architecture and solutions that support enterprise data platforms. This role involves working closely with IT leadership, cross-functional engineering teams, and business stakeholders to define scalable data frameworks, data governance strategies, and modern data integration approaches. The candidate is expected to provide technical leadership and guidance, drive best practices, and support enterprise data transformation initiatives. Key Responsibilities: Architect, design, and implement scalable cloud-based data platforms and solutions for enterprise analytics, data warehousing, and real-time data pipelines. Lead the development of data architecture standards, data models, reference architectures, and integration patterns. Develop and maintain cloud data frameworks using modern technologies such as Azure, AWS, or Google Cloud. Oversee the end-to-end data lifecycle including ingestion, transformation, metadata management, security, and …
Data Engineer. Position Summary: The Data Engineer is responsible for building and maintaining scalable data pipelines, data warehousing solutions, and data platforms that support analytics, machine learning, and business intelligence. This role focuses on data integration, ETL/ELT workflows, and ensuring data quality and availability. Key Responsibilities: Develop, maintain, and optimize data pipelines (batch and streaming). Build and manage data lakes, data warehouses, and analytics platforms. Design ETL/ELT workflows and automate data ingestion from multiple sources. Work with big data tools and distributed processing frameworks. Ensure data quality, validation, governance, and lineage. Collaborate with data scientists, analysts, and product teams. Optimize data storage, query performance, and processing efficiency. Develop APIs, dashboards, or interfaces for data access. Manage cloud-based data infrastructure (AWS, Azure, GCP). Document architecture, workflows, and data models. Required Skills & Experience: Strong experience with SQL, relational databases, and NoSQL systems. Proficiency in Python, Scala, or Java for data engineering tasks. Experience with big data tools (Spark, Hadoop, Kafka, Flink). Knowledge of ETL/ELT tools …
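As an illustration of the batch ETL responsibilities this listing describes, the extract-transform-load pattern can be sketched minimally. This is a stdlib-only toy (the `payments` table and CSV fields are invented for the example, not any employer's schema): parse a source, apply a simple data-quality gate, and load the cleaned batch into a target table.

```python
import csv
import io
import sqlite3

def run_batch_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform (clean and cast), load into SQLite."""
    # Extract: parse the raw CSV source.
    rows = csv.DictReader(io.StringIO(csv_text))

    # Transform: normalise names, cast amounts, and drop malformed records
    # (a minimal data-quality gate).
    cleaned = []
    for r in rows:
        try:
            cleaned.append((r["name"].strip().title(), float(r["amount"])))
        except (KeyError, TypeError, ValueError):
            continue

    # Load: write the cleaned batch into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    src = "name,amount\n alice ,10.5\nbob,oops\nCAROL,3\n"
    print(run_batch_etl(src, sqlite3.connect(":memory:")))  # 2 valid rows loaded
```

A production pipeline would swap the CSV/SQLite stand-ins for the listing's actual sources and warehouse, but the extract, validate-and-transform, and load stages keep the same shape.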
Data/AI Enterprise Architect. Washington, DC - 100% ONSITE. US citizenship required per government contract. Must be able to obtain Public Trust. ALTA IT Services is in search of a candidate who has deep knowledge of enterprise architecture principles, AI technologies, and data management practices to guide decision-making and ensure that the architecture supports business objectives and enhances operational capabilities. Key Responsibilities: • Design and develop end-to-end AI and data architectures that support business goals, ensuring scalability, performance, security, and maintainability. • Create architectural blueprints and roadmaps that guide the integration of AI and data solutions across the organization. • Lead the development and implementation of data platforms and AI-driven systems that facilitate advanced … units, ensuring alignment with business objectives. • Oversee the deployment and integration of AI models, tools, and technologies into production systems. • Design and implement scalable cloud-based and on-premises data architectures using platforms like Azure, AWS, or Google Cloud. • Work with big data technologies (e.g., Hadoop, Spark) and data lake architectures to ensure the organization's …
Preston, Lancashire, North West, United Kingdom Hybrid/Remote Options
Circle Group
Head of Data Engineering - Preston. A Head of Data Engineering/Data Engineering Manager to lead the design, development, and enhancement of the data infrastructure and pipelines is required by a leading company based in Preston. The role offers hybrid working: 2-3 days in the office a week. You must have the following: Proven experience as a lead data engineer/Data Engineering Manager with some management experience. Experience handling large datasets, complex data pipelines, and big data processing frameworks and technologies. Experience with data modelling, Databricks, data integration/ETL processes, and designing efficient data structures. Strong programming skills in Python, Java, or Scala. Data warehousing concepts and dimensional modelling experience. Any data engineering skills in Azure Databricks and Microsoft Fabric would be a bonus. This new role involves building and managing a team of data engineers, fostering a culture of technical excellence and continuous improvement. Collaboration with cross-functional teams is essential to ensure robust, scalable, and aligned data solutions for delivering high …
Want to work in a successful IT company? Then take advantage of your chance: JOB WORLD GmbH is a partner of the leading IT companies in Austria. We offer dedicated Big Data Administrators (Apache Hadoop/Cloudera) top access to interesting IT jobs. We are looking for DIRECT EMPLOYMENT for a renowned IT company: Big Data Administrator (Apache Hadoop/Cloudera) (all genders). Responsibilities: Administer, monitor and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud). Manage and maintain services like Kafka, Flink, NiFi, DynamoDB and Iceberg Tables. IaC deployment via Terraform. Plan and execute updates/upgrades. Advise our Data Engineers and Data Scientists on the selection of … level support through troubleshooting and error analysis. Support workload optimization (e.g. query optimization). Evaluation and implementation of Cloud Native Services. Profile: Technical education (Computer Science HTL, computer science degree, Data Science, etc.). At least 5 years' experience in Apache Hadoop/Cloudera environments. Migration to AWS. Experience in system administration of Linux systems (RedHat …
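The workload-optimization duty above (e.g. query optimization) can be illustrated with a small, self-contained sketch. SQLite's query planner stands in here for the Hive/Impala tuning the listing actually means (an assumption for illustration only): adding an index to a filtered column turns a full table scan into an index search.

```python
import sqlite3

# Build a small table with a skewed filter column to optimize against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); keep the detail.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
before = plan(query)  # full table scan, e.g. "SCAN events"
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)   # index search using idx_events_user
print(before)
print(after)
```

The same before/after-plan discipline applies on Hive or Impala (via their own EXPLAIN output), where the remedies are partitioning, file-format, and statistics choices rather than a B-tree index.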
Role: Senior Big Data & DevOps Engineer. Bill Rate: $78/hour C2C. Location: Dallas, TX. Duration: 12+ months/long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement. We are seeking a highly experienced Senior Big Data & DevOps Engineer to manage end-to-end data operations for enterprise-scale platforms. The ideal candidate will have 8+ years of experience in Big Data technologies, ETL development, and DevOps automation, with hands-on expertise in HDFS, Hive, Impala, PySpark, Python, Jenkins, and uDeploy. This role is critical in ensuring the stability, scalability, and efficiency of data platforms while enabling smooth development-to-production workflows. Required Qualifications: Bachelor's degree in Computer Science, IT, or related field. 8+ years of experience in Big Data engineering and DevOps practices. Advanced proficiency in HDFS, Hive, Impala, PySpark, Python, and Linux. Hands-on experience with CI/CD tools such as Jenkins and uDeploy. Strong understanding of ETL development, orchestration, and performance optimization. Experience with ServiceNow for incident/change/problem management. Excellent analytical, troubleshooting, and communication skills. Nice to Have: Exposure to cloud …
We are seeking a Sr. Lead Data Engineer for a hybrid W2 position (must work on our W2 and commute to Huntsville, Texas when required). The role will lead the design, implementation, and management of end-to-end, enterprise-grade data solutions for our Huntsville, Texas, client. This role requires expertise in building and optimizing data warehouses, data lakes, and lakehouse platforms, with a strong emphasis on data engineering, data science, and machine learning. You will work closely with cross-functional teams to create scalable and robust architectures that support advanced analytics and machine learning use cases, while adhering to industry standards and best practices. Responsibilities include: Architect, design, and manage the entire data lifecycle, from data ingestion, transformation, storage, and processing to advanced analytics and machine learning, across databases and large-scale processing systems. Implement robust data governance frameworks, including metadata management, lineage tracking, security, compliance, and business glossary development. Identify, design, and implement internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and …
Description: This position will play a critical role in the success of the Firm's Data Strategy program. As a Vice President on the Data Governance team within the firmwide CDO, you will be responsible for working with stakeholders to define governance and tooling requirements and building out the firmwide data lineage framework. The firmwide team also … to deliver this governance framework. In addition, you will be responsible for driving adoption of the firmwide CDO framework by the LOB and Corporate Function CDOs. The Chief Data & Analytics Office (CDAO) at JPMorgan Chase is responsible for accelerating the firm's data and analytics journey. This includes ensuring the quality, integrity, and security of the company's data, as well as leveraging this data to generate insights and drive decision-making. The CDAO is also responsible for developing and implementing solutions that support the firm's commercial goals by harnessing artificial intelligence and machine learning technologies to develop new products, improve productivity, and enhance risk management effectively and responsibly. Job Responsibilities: Partners with CDO …
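The data lineage framework responsibilities above can be sketched in miniature. This is an illustrative toy (the `LineageGraph` class and dataset names are invented for the example, not the firm's actual tooling): record each transformation's inputs and output, then trace any dataset back to its transitive upstream sources.

```python
from collections import defaultdict

class LineageGraph:
    """Toy dataset-level lineage: record each job's inputs and output,
    then trace any dataset back to its transitive upstream sources."""

    def __init__(self):
        self._parents = defaultdict(set)  # dataset -> direct upstream datasets
        self._jobs = {}                   # dataset -> job that produced it

    def record(self, *, inputs, output, job):
        # Register one transformation's edges in the lineage graph.
        self._parents[output] |= set(inputs)
        self._jobs[output] = job

    def upstream(self, dataset):
        # Depth-first walk over parent edges to collect all source datasets.
        seen, stack = set(), [dataset]
        while stack:
            for parent in self._parents.get(stack.pop(), ()):
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

g = LineageGraph()
g.record(inputs={"raw.trades"}, output="staging.trades", job="ingest")
g.record(inputs={"staging.trades", "ref.fx_rates"}, output="mart.pnl", job="pnl_calc")
print(sorted(g.upstream("mart.pnl")))  # ['raw.trades', 'ref.fx_rates', 'staging.trades']
```

Enterprise lineage tools capture the same input/output edges automatically from ETL jobs and SQL parsing; the governance value is exactly this upstream traversal, answering "which sources feed this report?" for impact analysis and audit.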