Fairfax, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data; this involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis; this involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance; Machine learning model development and maintenance; Data integration processes; Data security and privacy regulations; Data visualization tools development; Data warehouse and data mart design and development; ETL (Extract, Transform, Load) processes; Data governance and compliance; Proficiency in SQL and Python; Knowledge of Big Data technologies (Hadoop, Spark); Cloud computing (AWS, Azure, GCP); Data modeling and architecture; Advanced …
Richmond, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data; this involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis; this involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance; Machine learning model development and maintenance; Data integration processes; Data security and privacy regulations; Data visualization tools development; Data warehouse and data mart design and development; ETL (Extract, Transform, Load) processes; Data governance and compliance; Proficiency in SQL and Python; Knowledge of Big Data technologies (Hadoop, Spark); Cloud computing (AWS, Azure, GCP); Data modeling and architecture; Advanced …
Manassas, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data; this involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis; this involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance; Machine learning model development and maintenance; Data integration processes; Data security and privacy regulations; Data visualization tools development; Data warehouse and data mart design and development; ETL (Extract, Transform, Load) processes; Data governance and compliance; Proficiency in SQL and Python; Knowledge of Big Data technologies (Hadoop, Spark); Cloud computing (AWS, Azure, GCP); Data modeling and architecture; Advanced …
Virginia Beach, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data; this involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis; this involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance; Machine learning model development and maintenance; Data integration processes; Data security and privacy regulations; Data visualization tools development; Data warehouse and data mart design and development; ETL (Extract, Transform, Load) processes; Data governance and compliance; Proficiency in SQL and Python; Knowledge of Big Data technologies (Hadoop, Spark); Cloud computing (AWS, Azure, GCP); Data modeling and architecture; Advanced …
Newport News, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data; this involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis; this involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance; Machine learning model development and maintenance; Data integration processes; Data security and privacy regulations; Data visualization tools development; Data warehouse and data mart design and development; ETL (Extract, Transform, Load) processes; Data governance and compliance; Proficiency in SQL and Python; Knowledge of Big Data technologies (Hadoop, Spark); Cloud computing (AWS, Azure, GCP); Data modeling and architecture; Advanced …
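As an illustration of the ETL work the Centene listings above describe — cleansing, aggregating, and enriching ingested records for a Security Data Lake — here is a minimal PySpark sketch. All paths, table layouts, and column names (event_id, source_ip, asset_id) are hypothetical, chosen only to make the example self-contained.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("security-lake-etl").getOrCreate()

# Hypothetical raw events table already ingested into the lake.
raw = spark.read.parquet("/lake/raw/security_events")

# Cleansing: drop malformed rows, conform the timestamp, deduplicate.
cleaned = (
    raw.dropna(subset=["event_id", "source_ip"])
       .withColumn("event_time", F.to_timestamp("event_time"))
       .dropDuplicates(["event_id"])
)

# Enrichment: join against a hypothetical asset-inventory lookup.
assets = spark.read.parquet("/lake/reference/asset_inventory")
enriched = cleaned.join(assets, on="source_ip", how="left")

# Aggregation: hourly event counts per asset, ready for analysis.
hourly = (
    enriched.groupBy(F.window("event_time", "1 hour"), "asset_id")
            .agg(F.count("*").alias("event_count"))
)

hourly.write.mode("overwrite").parquet("/lake/curated/hourly_event_counts")
```

The same pattern — drop malformed rows, conform types, join reference data, then aggregate — generalizes to most of the pipeline work these listings mention.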
data challenges. Your Job's Key Responsibilities Are: Designing, developing, and maintaining robust data pipelines using Databricks, Spark, and Python; Building efficient and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses; Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity Catalog) to deliver reliable and … data manipulation libraries (e.g., PySpark, Spark SQL); Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables; Solid understanding of data warehousing principles, ETL/ELT processes, data modeling techniques, and database systems; Proven experience with at least one major cloud platform (Azure, AWS, or GCP); Excellent SQL skills for data querying, transformation …
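The Databricks responsibilities listed above — scalable ETL from databases and APIs into cloud data lakes, landing on Delta Lake — could be sketched as follows. The JDBC connection string, credentials, and table names are placeholders, not details from the listing.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Ingest from a source system (JDBC connection details are placeholders).
orders = (
    spark.read.format("jdbc")
         .option("url", "jdbc:postgresql://source-host:5432/sales")
         .option("dbtable", "public.orders")
         .option("user", "reader")
         .option("password", "...")  # in practice, pull from a secret scope
         .load()
)

# Land the data as a managed Delta table. Delta is the default format
# on Databricks, but it is stated explicitly here for clarity.
(
    orders.write.format("delta")
          .mode("append")
          .saveAsTable("bronze.orders_raw")
)
```

With Unity Catalog in play, the target would typically use three-level naming (catalog.schema.table) and access would be governed centrally rather than per-workspace.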
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
database standards. Collaborate with team members to support data workflows for bronze, silver, and gold data tables. Help identify recurring issues and suggest improvements for operational efficiency. Support basic ETL (Extract, Transform, Load) processes to maintain data pipelines. Execute simple data quality checks to ensure the accuracy and consistency of data. Assist in troubleshooting database errors and performance bottlenecks. Escalate … as needed. Adhere to security protocols and compliance requirements in database operations. Assist in Data Migration and Integration Efforts - Support the transition of data between systems by helping with Extract, Transform, Load (ETL) processes and ensuring data consistency across different platforms. Monitor and Troubleshoot Database Performance Issues - Identify potential bottlenecks, perform root cause analysis, and work with senior architects to …
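The "simple data quality checks" this Epsilon role mentions might look like the sketch below — a few PySpark assertions over a hypothetical silver-layer table. The table and column names (silver.customers, customer_id, age) are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical silver-layer table to validate.
df = spark.read.table("silver.customers")

# Completeness: the business key must never be NULL.
null_keys = df.filter(F.col("customer_id").isNull()).count()

# Uniqueness: the business key must not repeat.
dup_keys = (
    df.groupBy("customer_id").count()
      .filter(F.col("count") > 1)
      .count()
)

# Validity: a simple range check on a numeric column.
bad_ages = df.filter((F.col("age") < 0) | (F.col("age") > 120)).count()

checks = {
    "no_null_customer_id": null_keys == 0,
    "unique_customer_id": dup_keys == 0,
    "age_in_range": bad_ages == 0,
}
failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```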
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
part in ongoing training and knowledge-sharing sessions with more experienced architects. Assist in Data Migration and Integration Efforts - Support the transition of data between systems by helping with Extract, Transform, Load (ETL) processes and ensuring data consistency across different platforms. Monitor and Troubleshoot Database Performance Issues - Identify potential bottlenecks, perform root cause analysis, and work with senior architects to … in either Python or SQL, with knowledge of building and optimizing data pipelines or analytics solutions. Familiarity with popular relational databases (e.g., MySQL, PostgreSQL, SQL Server); Basic understanding of ETL processes and tools (e.g., Talend, Informatica, or similar); Exposure to cloud-based data services (e.g., AWS RDS, Azure SQL Database, or Google Cloud SQL); Understanding of fundamental networking concepts and …
data manipulation libraries (e.g., PySpark, Spark SQL) • Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables • Solid understanding of data warehousing principles, ETL/ELT processes, data modeling techniques, and database systems • Proven experience with at least one major cloud platform (Azure, AWS, or GCP) • Excellent SQL skills for data querying, transformation … platform, supporting clients in solving complex data challenges. • Designing, developing, and maintaining robust data pipelines using Databricks, Spark, and Python • Building efficient and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses • Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity Catalog) to deliver reliable and … Staying up to date with advancements in Databricks, data engineering, and cloud technologies to continuously improve tools and approaches Technologies: AI AWS Azure CI/CD Cloud Databricks DevOps ETL GCP Support Machine Learning Power BI Python PySpark SQL Spark Terraform Unity GameDev Looker SAP More: NETCONOMY has grown over the past 20 years from a startup to a …
to ensure data infrastructure is optimized for performance, scalability, and reliability. Provide leadership and mentorship to data architecture and engineering teams. Data Integration and Management: Design and implement robust ETL processes to integrate data from various sources into the data warehouse and big data platforms. Oversee the management of metadata, master data, and data lineage across systems. Ensure data consistency … Proven track record of designing and implementing large-scale data architectures in complex environments. CI/CD and DevOps experience is a plus. Skills: Strong expertise in data modeling, data integration (ETL/ELT), and database design. Proficiency in SQL, PL/SQL, and performance tuning in Teradata, Oracle, and other databases. Strong experience with cloud data platform tools and services (e.g. …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Jira (or similar agile tool), ensuring timely and high-quality deliverables. Experience with cloud platforms (AWS, Azure, or GCP), including infrastructure provisioning, configuration, and cost optimization. Solid understanding of ETL/ELT processes and tools, with a track record of designing and managing reliable data pipelines. Expertise in CI/CD tools (e.g., Jenkins, GitLab CI, or Azure DevOps), including …
a key role in shaping, implementing, and enhancing our Data Lake using Azure Databricks. Source Integration: Establish connections to diverse systems via APIs, ODBC, Event Hubs, and sFTP protocols. ETL/ELT Pipelines: Design and optimize data pipelines using Azure Data Factory and Databricks. Medallion Architecture: Implement Bronze, Silver, and Gold layers using formats like Delta, Parquet, and JSON for …
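The Bronze/Silver/Gold medallion flow this listing describes could be sketched as below, using Delta for the lake tables and JSON as the landing format (the listing also names Parquet). Paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw JSON landed as-is, with ingestion metadata.
bronze = (
    spark.read.json("/landing/events/*.json")
         .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").save("/lake/bronze/events")

# Silver: cleansed and conformed — typed columns, duplicates removed.
silver = (
    spark.read.format("delta").load("/lake/bronze/events")
         .withColumn("event_time", F.to_timestamp("event_time"))
         .dropDuplicates(["event_id"])
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/events")

# Gold: a business-level aggregate for reporting.
gold = (
    silver.groupBy(F.to_date("event_time").alias("event_date"))
          .agg(F.count("*").alias("events"))
)
gold.write.format("delta").mode("overwrite").save("/lake/gold/daily_event_counts")
```

Each layer only reads from the one below it, which keeps raw data replayable and pushes cleansing logic to a single, auditable place.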
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
data and analytics needs. Design and deploy end-to-end data solutions using Microsoft Fabric, encompassing data ingestion, transformation, and visualisation workflows. Construct and refine data models, pipelines, and ETL frameworks within the Fabric ecosystem. Leverage Fabric's suite of tools to build dynamic reports, dashboards, and analytical applications. Maintain high standards of data integrity, consistency, and system performance across …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
structures. Demonstrated ability to write clean, efficient code in Python and SQL, with knowledge of building and optimizing data pipelines or analytics solutions. Foundational understanding of data pipelines and ETL/ELT processes, including data ingestion and transformation concepts. Basic knowledge of CI/CD tools (e.g., Jenkins, GitLab CI) with a willingness to learn more advanced automation practices. General …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
using Azure Data Factory (ADF), ensuring efficient and reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata-driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams …
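A minimal sketch of the "metadata-driven pipelines" this listing mentions: a small metadata list decides what gets extracted and where it lands, so adding a source means adding a row of configuration rather than new pipeline code. All connection details, queries, and table names are invented for illustration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Metadata driving the pipeline; in practice this would live in a
# control table or config file rather than in the job itself.
SOURCES = [
    {"name": "customers", "query": "SELECT * FROM dbo.customers", "target": "bronze.customers"},
    {"name": "policies",  "query": "SELECT * FROM dbo.policies",  "target": "bronze.policies"},
]

JDBC_URL = "jdbc:sqlserver://source-host:1433;databaseName=core"

for src in SOURCES:
    # One generic extract-and-land step, parameterised by metadata.
    df = (
        spark.read.format("jdbc")
             .option("url", JDBC_URL)
             .option("query", src["query"])
             .option("user", "reader")
             .option("password", "...")  # use a secret store in practice
             .load()
    )
    df.write.format("delta").mode("overwrite").saveAsTable(src["target"])
```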
market. Working specifically with underwriting teams to support their BI processes. Write efficient DAX measures and calculated columns to support analytical requirements. Develop robust SQL queries, stored procedures, and ETL processes to source and transform data. Translate complex data requirements into scalable BI solutions. Collaborate with business users to gather requirements and deliver meaningful insights. Lead data governance and quality … . Experience working with Underwriting teams. Strong understanding of insurance-specific data models including premiums, claims, exposures, and bordereaux. Hands-on experience working with data warehouses, data marts, and ETL pipelines. Solid knowledge of relational databases (e.g., MS SQL Server, Azure SQL). Excellent problem-solving skills and ability to work independently or as part of a team. Strong stakeholder …
on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift); Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines; Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices; Familiarity with data visualisation tools (Tableau, Power BI, Looker) and analytics frameworks. Leadership & Communication: Proven experience leading technical …
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
Broster Buchanan Ltd
time data streaming, batch data processing and data transformation processes. Experience with core tools such as Data Factory, Databricks, Synapse, Kafka and Python. Any exposure to data migration/ETL would be highly beneficial, with SQL/T-SQL, SSIS, SSRS and SSAS, as there is a large data migration project planned. Any previous experience with setting up data governance … and processes would be highly beneficial. Good proficiency in Power BI and Tableau. Strong knowledge of data integration techniques and ETL/ELT processes. Experience with data modelling, data warehousing, and data governance best practices …
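For the real-time streaming side this role describes, a common Kafka-to-lake pattern with Spark Structured Streaming looks roughly like the sketch below. The broker address, topic name, and paths are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read a hypothetical Kafka topic as a structured stream.
stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "orders")
         .load()
)

# Kafka delivers key/value as binary; cast the value to a string
# before any downstream parsing and transformation.
parsed = stream.select(F.col("value").cast("string").alias("payload"))

# Write the stream to a Delta table, with checkpointing for recovery.
query = (
    parsed.writeStream.format("delta")
          .option("checkpointLocation", "/lake/_checkpoints/orders")
          .outputMode("append")
          .start("/lake/bronze/orders_stream")
)
query.awaitTermination()
```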
ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply! Job Responsibilities. ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using …
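The Airflow orchestration this listing names could be wired along these lines — a three-task DAG chaining extract, transform, and load steps. The DAG id and schedule are illustrative and the task bodies are stubs; this assumes Airflow 2.4+ (earlier versions use schedule_interval instead of schedule).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull data from a source system

def transform():
    ...  # cleanse and reshape, e.g. by submitting a PySpark job

def load():
    ...  # write results to the warehouse or lake

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies: extract must finish before transform, then load.
    t_extract >> t_transform >> t_load
```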
Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities. ETL Development: Design and build robust and reusable ETL/ELT pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
with relational SQL databases either on premises or in the cloud. Power Platform experience is desirable. Experience delivering multiple solutions using key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. A solid understanding of key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and …