Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
maintenance of bronze, silver, and gold data tables Optimize query performance, indexing strategies, and resource utilization for scalable data solutions Implement and refine database schema designs to align with evolving business requirements Coordinate with AI/ML teams to structure data sets and support advanced analytical needs Lead the … high-quality deliverables. Familiarity with cloud-based data platforms (e.g., AWS, Azure, GCP) and associated managed services Experience with data modeling frameworks (e.g., star schema, snowflake schema) Knowledge of data governance principles, data cataloging, and metadata management Ability to work with version control tools (e.g., Git, SVN More ❯
with the remaining days from home. Key Skills Hands-on experience working with Teradata and other data warehouses. Deep expertise in Teradata architecture, star schema, snowflake schema, SQL optimization, and Data Modelling. Experience in implementing Teradata utilities (BTEQ, FastLoad, MultiLoad, TPT, etc.) for efficient data loading. More ❯
the rest of the days from home. Key Skills * Hands-on experience working with Teradata and other data warehouses. * Deep expertise in Teradata architecture, star schema, snowflake schema, SQL optimization, and Data Modelling. * Experience in implementing Teradata utilities (BTEQ, FastLoad, MultiLoad, TPT, etc.) for efficient Data More ❯
Job Title: Snowflake Data Engineer Location: Somerset, 3 days per week onsite Overview: We are looking for a talented Snowflake Data Engineer to join our growing data team. Reporting to the BI & Data Manager, you will play a key role in the technical build of our new enterprise … led by the manager, your focus will be on hands-on development: building scalable pipelines, designing efficient data models, and ensuring a high-quality Snowflake environment to support business intelligence, analytics, and data-driven decision-making. Key Responsibilities: Develop and optimize data pipelines and ETL processes into Snowflake. Build … structure, and maintain the Snowflake data warehouse. Work closely with the BI & Data Manager to align technical solutions with strategic goals. Implement data modelling best practices to support reporting and analysis needs. Ensure data integrity, security, and performance within the Snowflake environment. Collaborate with business analysts, developers, and More ❯
Job Title: Snowflake Data Engineer Location: Somerset, 3 days per week onsite Contract- outside IR35 Overview: We are looking for a talented Snowflake Data Engineer to join our growing data team. Reporting to the BI & Data Manager, you will play a key role in the technical build of … led by the manager, your focus will be on hands-on development: building scalable pipelines, designing efficient data models, and ensuring a high-quality Snowflake environment to support business intelligence, analytics, and data-driven decision-making. Key Responsibilities: Develop and optimize data pipelines and ETL processes into Snowflake. Build … structure, and maintain the Snowflake data warehouse. Work closely with the BI & Data Manager to align technical solutions with strategic goals. Implement data modelling best practices to support reporting and analysis needs. Ensure data integrity, security, and performance within the Snowflake environment. Collaborate with business analysts, developers, and More ❯
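Several of the postings above ask for hands-on development of ETL/ELT pipelines into Snowflake. As a rough, hedged illustration of what a single load step can look like, the sketch below stages raw files, copies them into a raw table, and runs a simple in-warehouse transform; the stage, table, and column names are invented for this example and are not taken from any posting.

```sql
-- Illustrative only: stage, table, and column names are assumptions.
-- Land raw files in a named stage, then copy them into a raw table for downstream modelling.
CREATE STAGE IF NOT EXISTS raw_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

CREATE TABLE IF NOT EXISTS raw_orders (
  order_id    NUMBER,
  customer_id NUMBER,
  order_date  DATE,
  amount      NUMBER(12,2)
);

COPY INTO raw_orders
  FROM @raw_stage/orders/
  ON_ERROR = 'ABORT_STATEMENT';

-- Simple ELT step: transform raw data into a curated table inside Snowflake.
CREATE OR REPLACE TABLE curated_orders AS
SELECT order_id, customer_id, order_date, amount
FROM raw_orders
WHERE amount IS NOT NULL;
```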
in developing and optimising ETL/ELT pipelines and using DBT for data transformation and modelling. Knowledge of data modelling techniques, including star and snowflake schemas, for efficient data analysis. Familiarity with cloud platforms such as AWS or GCP, including services like Databricks, Redshift, BigQuery, and Snowflake. Strong Python More ❯
/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally More ❯
Role: Snowflake Data Engineer / Data Validator (ITAR Export Control Project) Location: Charlotte, NC Duration: 5 Months Job Description: We are seeking a skilled Snowflake Data Engineer to join our ITAR (International Traffic in Arms Regulations) project team. As a Snowflake Data Engineer you will be responsible for … designing, building, validating, and maintaining scalable Snowflake data pipelines and solutions in compliance with ITAR regulations. The ideal candidate should have extensive experience with Snowflake, core SQL skills, cloud ETL processes, Python, GitHub, and Control-M and Redwood orchestration, plus a strong understanding of ITAR compliance requirements. Key Responsibilities: Snowflake Data Modeling: Design and implement efficient data models in Snowflake to support ITAR compliance requirements; optimize schema designs for performance and scalability. ETL Development: Develop ETL processes on IICS (Informatica Intelligent Cloud Services) to extract, transform, and load data from various sources into Snowflake; ensure data More ❯
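This posting highlights optimizing Snowflake schema designs for performance and scalability. One common technique, shown here only as an illustrative sketch with made-up table and column names and no ITAR-specific detail, is defining a clustering key on a large table so that queries filtering on those columns prune more micro-partitions.

```sql
-- Hypothetical example: table and column names are illustrative.
-- A clustering key helps Snowflake prune micro-partitions on large, frequently filtered tables.
CREATE TABLE shipment_events (
  shipment_id NUMBER,
  event_ts    TIMESTAMP_NTZ,
  export_flag BOOLEAN,
  payload     VARIANT
)
CLUSTER BY (event_ts, shipment_id);

-- Queries that filter on the clustering columns can then scan far fewer partitions.
SELECT COUNT(*)
FROM shipment_events
WHERE event_ts >= DATEADD(day, -7, CURRENT_TIMESTAMP())
  AND export_flag = TRUE;
```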
effective insight that will inform business decisions Experience working within a high-growth, fast-paced technology company Excellent knowledge of SQL and working with Snowflake Experience using BI tools like Tableau, DBT, Power BI, or any BI reporting tool and data visualization application. Excellent skills in Data Warehouse Design & Development … Dimensional Data modelling, Star/Snowflake schema Write requirements documents, functional/technical specifications, and test plans. Experience working in an Agile Scrum Environment Bachelor's Degree in Computer Science, Mathematics, Engineering or similar discipline Compensation We offer a competitive salary, commensurate with experience, and provide excellent benefits including More ❯
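For readers unfamiliar with the dimensional modelling these postings reference, the following is a minimal star-schema sketch: one fact table keyed to two dimension tables. All names are illustrative, and Snowflake treats the primary and foreign key constraints as informational rather than enforced.

```sql
-- A minimal star-schema sketch; all table and column names are made up for illustration.
CREATE TABLE dim_customer (
  customer_key  NUMBER PRIMARY KEY,
  customer_id   VARCHAR,
  customer_name VARCHAR,
  region        VARCHAR
);

CREATE TABLE dim_date (
  date_key      NUMBER PRIMARY KEY,  -- e.g. 20240131
  calendar_date DATE,
  year          NUMBER,
  month         NUMBER
);

-- The fact table holds measures and foreign keys to the surrounding dimensions.
CREATE TABLE fact_sales (
  sale_id      NUMBER,
  customer_key NUMBER REFERENCES dim_customer (customer_key),
  date_key     NUMBER REFERENCES dim_date (date_key),
  quantity     NUMBER,
  net_amount   NUMBER(12,2)
);
```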
scale data processing. Familiarity with data visualisation platforms such as Looker/Looker Studio/Tableau/QuickSight. Experience with data modelling such as Snowflake/Star Proficiency in using Terraform and implementing Infrastructure as Code. Solid experience in setting up and maintaining CI/CD pipelines. Strong understanding More ❯
and testing Experience with version control systems and CI/CD pipelines In-depth knowledge of Kimball data modeling techniques, such as star and snowflake schemas Strong SQL skills and ability to write performant queries Proficient in data analysis and interpretation Excellent verbal and written communication skills Proven track More ❯
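Building on the illustrative star schema sketched earlier, a typical Kimball-style "performant query" is a star join that filters on dimension attributes and aggregates measures from the fact table, as in this assumed example:

```sql
-- Assumes the illustrative fact_sales / dim_customer / dim_date tables sketched above.
-- Star join: filter on the dimensions, aggregate on the fact.
SELECT
  d.year,
  c.region,
  SUM(f.net_amount) AS total_sales
FROM fact_sales   AS f
JOIN dim_customer AS c ON c.customer_key = f.customer_key
JOIN dim_date     AS d ON d.date_key     = f.date_key
WHERE d.year = 2024
GROUP BY d.year, c.region
ORDER BY total_sales DESC;
```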
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
T-SQL for data extraction and transformation Hands-on experience with Azure technologies (Data Lake, Data Factory, Synapse, etc.) Data modelling experience (star/snowflake schemas, normalization, etc.) Familiarity with ETL processes and working alongside Data Engineers Excellent communication and stakeholder management skills Insurance industry experience (ideally Reinsurance) Desirable More ❯
and programmatic data access patterns Data Engineering & Modelling Strong T-SQL skills for data retrieval and performance tuning Knowledge of dimensional modelling , star/snowflake schemas , and data warehouse best practices Preferred Qualifications Microsoft certifications such as DA-100 , DP-500 , or MCSE: BI Familiarity with CI/CD More ❯
improvements for data processing capabilities. Write and optimize SQL queries to ensure data integrity, performance, and scalability. Implement a flexible Data Vault model in Snowflake to support large-scale analytics and BI. Collaborate with Data Engineers, Product Managers, and Data Scientists to deliver data-driven solutions. Engage with stakeholders … solutions. Implement and enforce data governance and quality processes across systems. Support cloud platforms such as AWS/Azure and tools like DBT with Snowflake for scalable data solutions. Continuously seek improvements in data systems, processes, and tools for efficiency and scalability. Key Skills/Experience: Solid understanding of … ETL/ELT processes with hands-on experience using DBT, Snowflake, Python, SQL, Terraform, and Airflow. Experience designing and implementing cloud-based data products and solutions. Proficiency with cloud data warehouses and analytics platforms such as Snowflake, AWS, and Azure. Experience with GitHub for version control and project More ❯
optimize SQL queries, ensuring data integrity, performance, and scalability, using best practices and techniques * Data Vault model implementation: Implement a flexible Data Vault model in Snowflake to support large-scale analytics and business intelligence. * Cross-Team Collaboration: Collaborate with Data Engineers, Product Managers, and Data Scientists to deliver solutions that … ensuring accurate and consistent data flows across all systems. * Cloud & Infrastructure Support: Work with cloud platforms such as AWS/Azure and DBT with Snowflake to build and maintain scalable data solutions. * Continuous Improvement: Proactively look for ways to improve data systems, processes, and tools, ensuring efficiency and scalability. … ETL/ELT & Data Pipelines: Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python, SQL, Terraform, and Airflow * Experience in designing and implementing data products and solutions on cloud-based architectures. * Cloud Platforms: Experience working with cloud More ❯
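The Data Vault modelling mentioned in the two postings above separates business keys from their descriptive history. A minimal sketch of the core structures, assuming invented names and a SHA-256 hashing convention, might look like this in Snowflake:

```sql
-- Illustrative Data Vault structures only; names and hashing choices are assumptions.
-- Hub: one row per business key, identified by a hash key.
CREATE TABLE hub_customer (
  customer_hk   BINARY(32),     -- e.g. SHA-256 of the business key
  customer_id   VARCHAR,        -- business key from the source system
  load_dts      TIMESTAMP_NTZ,
  record_source VARCHAR
);

-- Satellite: descriptive attributes over time, keyed by the hub hash key and load date.
CREATE TABLE sat_customer_details (
  customer_hk   BINARY(32),
  load_dts      TIMESTAMP_NTZ,
  customer_name VARCHAR,
  email         VARCHAR,
  hash_diff     BINARY(32)      -- supports change detection across attribute versions
);
```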
and skills Experience with AWS services such as Lambda, and with Terraform. Knowledge of Java and front-end development. Skills in User Experience design. Experience with Snowflake and data modeling. Experience working in large-scale, global, highly regulated environments. About the Team J.P. Morgan is a global leader in financial services More ❯
Raritan, New Jersey, United States Hybrid / WFH Options
esrhealthcare
Experience: 14+ Years Duration: Long-term Contract Need LinkedIn & Passport No with the profile Mandatory Skills: Snowflake, ETL, Informatica, Python, SQL, Cloud, Data Visualization Job Description: As a Snowflake Tech Lead, you will play a pivotal role in designing, developing, and implementing data solutions using Snowflake, while … a team of talented engineers. Key Responsibilities: Lead the design, development, and implementation of scalable and efficient data solutions using Snowflake. Architect and optimize Snowflake data models, ETL processes, and data pipelines to ensure high performance and reliability. Collaborate with cross-functional teams, including data scientists, analysts, and business … data requirements and deliver solutions that meet business needs. Provide technical leadership and mentorship to a team of data engineers, ensuring best practices in Snowflake development and data engineering. Monitor and troubleshoot Snowflake environments to ensure optimal performance, cost-efficiency, and data integrity. Stay up-to-date with More ❯
explain technical concepts to non-technical stakeholders and influence through data-driven insights. Technical: Proficient in Alteryx, Power BI, SAP Analytics Cloud; experienced with Snowflake and data modelling; familiarity with R and Python highly desirable. Financial Services: Experience working in a Finance firm, preferably Wealth Manager/Private Bank More ❯
/ELT pipelines, particularly with tools like DBT, ensures efficient data transformation and modelling. A strong understanding of data modelling techniques, including star and snowflake schemas, is critical for structuring data for analysis. Proficiency in cloud platforms, such as AWS and GCP, with hands-on experience in services like … Databricks, Redshift, BigQuery, and Snowflake, is highly valued. Advanced Python skills for data manipulation, automation, and scripting, using libraries like Pandas and NumPy, are necessary for effective data engineering. Expertise in managing and optimising data architectures within data warehouse and lakehouse environments is a core requirement. Proficiency in version More ❯
other sources. Apply data cleansing rules to ensure high data quality standards. Model data into a single source of truth using Kimball methodology (star schema, snowflake, etc.). Develop high-quality code following DevOps and software engineering best practices, including testing and CI/CD. Monitor and maintain More ❯
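This last posting pairs data cleansing with Kimball-style modelling into a single source of truth. As a hedged sketch in which the source table, columns, and rules are all assumptions, a staging step might trim and standardise values and keep only the latest record per business key before it feeds a conformed dimension:

```sql
-- Hedged sketch: raw_customer, loaded_at, and the cleansing rules are invented for illustration.
CREATE OR REPLACE TABLE stg_customer_clean AS
SELECT
  TRIM(customer_id)            AS customer_id,
  INITCAP(TRIM(customer_name)) AS customer_name,
  NULLIF(TRIM(email), '')      AS email
FROM raw_customer
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY TRIM(customer_id)
  ORDER BY loaded_at DESC
) = 1;  -- keep only the latest record per business key
```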