Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
maintenance of bronze, silver, and gold data tables Optimize query performance, indexing strategies, and resource utilization for scalable data solutions Implement and refine database schema designs to align with evolving business requirements Coordinate with AI/ML teams to structure data sets and support advanced analytical needs Lead the … high-quality deliverables. Familiarity with cloud-based data platforms (e.g., AWS, Azure, GCP) and associated managed services Experience with data modeling frameworks (e.g., star schema, snowflake schema) Knowledge of data governance principles, data cataloging, and metadata management Ability to work with version control tools (e.g., Git, SVN) More ❯
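The bronze/silver/gold layering this listing describes is the "medallion" pattern: raw ingests land in bronze, cleansed and deduplicated data in silver, business-level aggregates in gold. A minimal sketch in plain Python; the table contents, field names, and cleansing rules are invented for illustration, not taken from the role:

```python
# Minimal sketch of bronze/silver/gold (medallion) layering.
# All data and rules below are illustrative assumptions.

# Bronze: raw ingested records, kept as-is (duplicates and bad values included).
bronze = [
    {"order_id": 1, "amount": "100.0", "region": "EMEA"},
    {"order_id": 1, "amount": "100.0", "region": "EMEA"},   # duplicate
    {"order_id": 2, "amount": "bad",   "region": "AMER"},   # unparseable
    {"order_id": 3, "amount": "250.5", "region": "EMEA"},
]

def to_silver(rows):
    """Cleanse and deduplicate: cast types, drop rows that fail validation."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine these rows
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "amount": amount, "region": r["region"]})
    return out

def to_gold(rows):
    """Aggregate to a business-level mart: revenue per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EMEA': 350.5}
```

In a warehouse each layer would be a set of tables (or dbt models) rather than in-memory lists, but the contract is the same: each layer only reads from the one below it.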
the rest of the days from home. Key Skills * Hands-on experience working with Teradata and other data warehouses. * Deep expertise in Teradata architecture, star schema, snowflake schema, SQL optimization, and data modelling. * Experience in implementation of Teradata utilities (BTEQ, FastLoad, MultiLoad, TPT etc.) for efficient Data More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
T-SQL for data extraction and transformation Hands-on experience with Azure technologies (Data Lake, Data Factory, Synapse, etc.) Data modelling experience (star/snowflake schemas, normalization, etc.) Familiarity with ETL processes and working alongside Data Engineers Excellent communication and stakeholder management skills Insurance industry experience (ideally Reinsurance) Desirable More ❯
and programmatic data access patterns Data Engineering & Modelling Strong T-SQL skills for data retrieval and performance tuning Knowledge of dimensional modelling, star/snowflake schemas, and data warehouse best practices Preferred Qualifications Microsoft certifications such as DA-100, DP-500, or MCSE: BI Familiarity with CI/CD More ❯
improvements for data processing capabilities. Write and optimize SQL queries to ensure data integrity, performance, and scalability. Implement a flexible Data Vault model in Snowflake to support large-scale analytics and BI. Collaborate with Data Engineers, Product Managers, and Data Scientists to deliver data-driven solutions. Engage with stakeholders … solutions. Implement and enforce data governance and quality processes across systems. Support cloud platforms such as AWS/Azure and tools like DBT with Snowflake for scalable data solutions. Continuously seek improvements in data systems, processes, and tools for efficiency and scalability. Key Skills/Experience: Solid understanding of … ETL/ELT processes with hands-on experience using DBT, Snowflake, Python, SQL, Terraform, and Airflow. Experience designing and implementing cloud-based data products and solutions. Proficiency with cloud data warehouses and analytics platforms such as Snowflake, AWS, and Azure. Experience with GitHub for version control and project More ❯
optimize SQL queries, ensuring data integrity, performance, and scalability, using best practices and techniques * Data Vault model implementation: Implement a flexible Data Vault model in Snowflake to support large-scale analytics and business intelligence. * Cross-Team Collaboration: Collaborate with Data Engineers, Product Managers, and Data Scientists to deliver solutions that … ensuring accurate and consistent data flows across all systems. * Cloud & Infrastructure Support: Work with cloud platforms such as AWS/Azure and DBT with Snowflake to build and maintain scalable data solutions. * Continuous Improvement: Proactively look for ways to improve data systems, processes, and tools, ensuring efficiency and scalability. … ETL/ELT & Data Pipelines: Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python, SQL, Terraform, and Airflow * Experience in designing and implementing data products and solutions on cloud-based architectures. * Cloud Platforms: Experience working with cloud More ❯
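The Data Vault loading pattern this listing asks for centres on hubs keyed by a deterministic hash of the business key, loaded insert-only. A hedged sketch of that idea in Python; the names (`hub_customer_hk`, `customer_bk`) and the MD5-of-normalised-key convention are illustrative assumptions, not details from the role:

```python
# Sketch of a Data Vault hub load: hash the business key, insert only
# unseen keys. Column names and normalisation rules are assumptions.
import hashlib
from datetime import datetime, timezone

def hub_hash_key(business_key: str) -> str:
    """Hash of the normalised business key (trim + uppercase), a common
    Data Vault convention for deterministic hub keys."""
    return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

def load_hub(existing: dict, business_keys, record_source: str) -> dict:
    """Insert-only load: add a hub row only when the hash key is unseen."""
    for bk in business_keys:
        hk = hub_hash_key(bk)
        if hk not in existing:
            existing[hk] = {
                "hub_customer_hk": hk,
                "customer_bk": bk,
                "load_dts": datetime.now(timezone.utc),
                "record_source": record_source,
            }
    return existing

hub = load_hub({}, ["C-001", "c-001 ", "C-002"], "CRM")
print(len(hub))  # 2 -- "C-001" and "c-001 " normalise to the same key
```

In Snowflake the same logic would typically be a `MERGE` or insert-where-not-exists into a hub table, with the hash computed in SQL or dbt.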
Raritan, New Jersey, United States Hybrid / WFH Options
esrhealthcare
Experience: 14+ Years Duration: Long-term contract Need LinkedIn & passport number with the profile Mandatory Skills: Snowflake, ETL, Informatica, Python, SQL, Cloud, Data Visualization Job Description: As a Snowflake Tech Lead, you will play a pivotal role in designing, developing, and implementing data solutions using Snowflake, while … a team of talented engineers. Key Responsibilities: Lead the design, development, and implementation of scalable and efficient data solutions using Snowflake. Architect and optimize Snowflake data models, ETL processes, and data pipelines to ensure high performance and reliability. Collaborate with cross-functional teams, including data scientists, analysts, and business … data requirements and deliver solutions that meet business needs. Provide technical leadership and mentorship to a team of data engineers, ensuring best practices in Snowflake development and data engineering. Monitor and troubleshoot Snowflake environments to ensure optimal performance, cost-efficiency, and data integrity. Stay up-to-date with More ❯
other sources. Apply data cleansing rules to ensure high data quality standards. Model data into a single source of truth using Kimball methodology (star schema, snowflake, etc.). Develop high-quality code following DevOps and software engineering best practices, including testing and CI/CD. Monitor and maintain More ❯
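The Kimball-style modelling described above organises data as fact tables joined out to dimension tables (the star schema). A self-contained sketch using SQLite as a stand-in warehouse; the table and column names are hypothetical, not from the listing:

```python
# Star-schema sketch: one fact table, two dimensions, one typical query.
# Uses SQLite purely for a runnable demo; all names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    quantity    INTEGER,
    revenue     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gizmo', 'Hardware');
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO fact_sales VALUES (1, 20240101, 3, 30.0), (2, 20240101, 1, 15.0);
""")

# Typical star-schema query: aggregate facts, sliced by dimension attributes.
row = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date   d ON d.date_key    = f.date_key
    GROUP BY d.year, p.category
""").fetchone()
print(row)  # (2024, 'Hardware', 45.0)
```

A snowflake schema differs only in that dimensions are further normalised (e.g. `category` split into its own table referenced from `dim_product`).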
High Wycombe, Buckinghamshire, United Kingdom Hybrid / WFH Options
TieTalent
engineer; working with teams across the group to present their data in ways that make it useful, accurate and timely. We model data in Snowflake using dbt, making use of Power BI for most (but not all) of our presentation. Code is managed in GitHub using a continuous integration More ❯
ETL/ELT pipelines across structured and unstructured data sources using Airbyte, Airflow, DBT Core, and AWS Glue Design and maintain dimensional models in Snowflake, including SCDs and best practices for indexing, clustering, and performance Collaborate cross-functionally with analysts and business teams to support Power BI and enterprise … Data Warehousing (EDW) with hands-on experience in Kimball-style modelling Expert-level SQL skills for complex transformation and query tuning Deep knowledge of Snowflake including optimisation, cost management, and architecture Experience with modern data stacks – especially DBT Core, Airbyte, and Airflow Familiarity with AWS data services (e.g., S3 More ❯
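The SCDs mentioned above are most commonly handled as Type 2 in Kimball-style warehouses: when a tracked attribute changes, the current dimension row is closed out and a new current version is inserted, preserving history. A minimal sketch under that assumption, with invented field names:

```python
# Type 2 slowly changing dimension (SCD2) sketch: close changed rows,
# append new current versions. Field names are illustrative assumptions.
from datetime import date

def apply_scd2(dimension, incoming, as_of):
    """Apply a batch of incoming records to a Type 2 dimension."""
    current = {r["customer_id"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        old = current.get(rec["customer_id"])
        if old is None:
            # Brand-new key: insert as the first current version.
            dimension.append({**rec, "valid_from": as_of,
                              "valid_to": None, "is_current": True})
        elif old["city"] != rec["city"]:
            # Tracked attribute changed: close the old row, add a new one.
            old["valid_to"] = as_of
            old["is_current"] = False
            dimension.append({**rec, "valid_from": as_of,
                              "valid_to": None, "is_current": True})
    return dimension

dim = [{"customer_id": 1, "city": "London", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": 1, "city": "Leeds"}], date(2024, 6, 1))
print(len(dim))  # 2 -- the closed London row plus the current Leeds row
```

In Snowflake this is usually expressed as a `MERGE` statement or a dbt snapshot rather than row-by-row Python, but the versioning logic is the same.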
remote working And more WHAT WILL YOU BE DOING? We are looking for a skilled data professional to design and optimise ETL pipelines in Snowflake and Azure Data Factory, ensuring seamless data integration and transformation. This role involves building and managing semantic data models in Snowflake and Power BI to support scalable, user-friendly analytics and reporting. You will also develop Snowflake stored procedures using Python to automate workflows and handle complex data transformations. Maintaining data integrity and accessibility within Snowflake will be essential for effective data warehousing operations. Additionally, you will collaborate closely with analytics … consent for us to process and submit (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: Snowflake, Azure Data Factory, Power BI, Python, SQL, ETL More ❯
level. As Senior Analytics Engineer, you will have sole ownership of analytics engineering at Omaze. You will use industry standard tools and platforms (dbt, Snowflake, ThoughtSpot) to amplify the effectiveness and impact of our (growing) analytics team. You will provide clean, tested, well-documented models, and work with our … give the team access to new data sources. What you’ll do: Fully own our dbt project, building and maintaining data models in our Snowflake data warehouse, blending and modelling data from multiple sources Work with analysts and engineers to collaboratively design and build new data models as the More ❯
like Azure Data Factory , Azure Data Lake , or Azure Synapse . Understanding of Azure’s security and identity management practices (e.g., IAM, RBAC). Snowflake Data Warehouse Experience : Designing and optimizing Snowflake schemas for efficient querying. Implementing ETL/ELT pipelines to load and transform data in Snowflake. More ❯
data and improve every decision we make at EFG. This is a unique opportunity to shape our data foundations, optimize our dbt models on Snowflake, and ensure our analysts and business users have access to clean, well-structured, high-performance datasets. Own Your Domain Ask all the whys, relentlessly … data stack - whether that is traditional BI, powering ML models or providing enriched data to operational systems via reverse ETL. Optimize dbt models on Snowflake, ensuring efficiency, reliability, and maintainability. Enforce best practices in dimensional modeling (star/snowflake schemas), normalization, and performance tuning. Write clean, modular SQL … write highly performant queries. You have multiple years of experience with dbt and have optimized models at scale. You have a deep understanding of Snowflake architecture, including performance tuning, clustering, caching, and cost optimization. You have created data sources for highly performant BI and know how best to optimize More ❯