London (City of London), South East England, United Kingdom
HCLTech
PySpark, and SQL, with the ability to build modular, efficient, and scalable data pipelines. Deep expertise in data modeling for both relational databases and data warehouses, including Star and Snowflake schema designs. Extensive experience working with AWS Redshift and Aurora for data warehousing and transactional workloads. Experience using dbt (Data Build Tool) for building modular, version-controlled, and …
data engineering role. Proficient in SQL and Python. Strong experience with AWS services (e.g., Lambda, Glue, Redshift, S3). Solid understanding of data warehousing and modelling: star/snowflake schemas. Familiarity with Git, CI/CD pipelines, and containerisation (e.g., Docker). Ability to troubleshoot BI tool connections (e.g., Power BI). Desirable Skills: Experience with Infrastructure …
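The star schema the listings above keep asking for can be shown in a few lines. This is a minimal sketch using SQLite from the Python standard library; all table and column names (`dim_product`, `fact_sales`, etc.) are illustrative assumptions, not taken from any particular warehouse.

```python
# Minimal star-schema sketch in SQLite (stdlib only). Names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one denormalized row per product (star style).
# In a snowflake schema, `category` would be normalized out into its own table.
cur.execute("""
    CREATE TABLE dim_product (
        product_id   INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    )
""")

# Fact table: one row per sale, with a foreign key into the dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount     REAL
    )
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 10.0), (2, 1, 5.0), (3, 2, 7.5)])

# Typical star-schema query: join fact to dimension, aggregate by an attribute.
cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    GROUP BY p.category
""")
print(cur.fetchall())  # → [('Hardware', 22.5)]
```

The trade-off these roles expect you to articulate: the star form keeps queries to a single join per dimension, while the snowflake form saves storage and avoids update anomalies by normalizing dimension attributes.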
London, South East, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
… Advanced programming skills in SQL and Python for data transformation and analysis. Experience with web analytics tools (e.g. Adobe Analytics). Familiarity with data warehousing and database modeling (Snowflake experience a plus). Strong communication and presentation skills, with the ability to explain complex data simply. Self-starter with a commercial mindset and a drive for continuous improvement. …
London, South East, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment Ltd
in governance forums and strategic planning. What You'll Bring: Proven experience designing and architecting complex, high-volume data systems. Strong hands-on knowledge of cloud platforms (AWS preferred), Snowflake, dbt, and data modelling tools. Experience with data integration, ETL, and API-based data exchange. Familiarity with Data Vault, dimensional modelling, and master data management. Excellent communication skills and …
dashboards, DAX, Power Query, and complex data modeling. Strong SQL skills for data extraction, transformation, and performance optimisation (essential). Solid understanding of data warehousing principles such as star and snowflake schemas, as well as ETL processes. Experience in designing and implementing semantic and tabular models for reporting solutions. Excellent communication abilities with proven experience collaborating with clients (essential). What …
improving a system that handles a high volume of traffic, so experience with cloud and data warehouse infrastructure will help you in this role (we use AWS, Cloudflare and Snowflake). Familiarity with infrastructure as code will also help when updating our cloud architecture (we use Terraform). We place a large focus on data quality so you'll … not just the code, but the architecture of our platforms and everything that enables the business to thrive. Gain expertise over our tools and services: Python, Docker, GitHub Actions, Snowflake, AWS. Participate in all team ceremonies and have direct input in the team's ways of working. This is a high-trust, supportive, and collaborative environment where you will … experience - we want to hire people to grow into the role and beyond. About the team: Python is our bread and butter. The wider data platform team uses dbt, Snowflake, and Looker to model, transform, and expose data for analytics and reporting across the business. We use Docker and Kubernetes to manage our production services. We use GitHub Actions …
London, South East, England, United Kingdom Hybrid / WFH Options
Reed.co.uk
Looker solutions that support sales performance and revenue growth. Develop, document, and distribute Looker reports, ensuring consistency, accuracy, and usability. Build and maintain robust data models in dbt and Snowflake, ensuring clean, trustworthy data flows into reporting. Support data integrity within Salesforce and other commercial data sources, enhancing quality and enabling consistent reporting. Automate ingestion and transformation of financial … tool such as Looker, Tableau, Power BI or similar. Good SQL skills; you know your way around a database. Some exposure to a cloud-based data warehouse such as Snowflake, Redshift, BigQuery or similar. Some knowledge of ETL/ELT and data automation. A desire to work with complex Salesforce detail and master that data. Excellent problem-solving …
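The "data models in dbt" these listings mention are layered SQL transformations: each model is built as a view or table on top of the models beneath it (commonly a staging layer over raw sources, then business-facing marts). A rough, stdlib-only analogue of that layering, with invented model names (`raw_orders`, `stg_orders`, `fct_revenue`), looks like this:

```python
# Rough sketch of dbt-style layered models: each "model" is a SQL view built
# on the layer below. Table and model names here are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_id INTEGER, amount_pence INTEGER, status TEXT)"
)
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                 [(1, 1000, "complete"), (2, 250, "cancelled"), (3, 499, "complete")])

# Staging model: clean and standardize the raw source (pence -> pounds).
conn.execute("""
    CREATE VIEW stg_orders AS
    SELECT order_id, amount_pence / 100.0 AS amount_gbp, status
    FROM raw_orders
""")

# Mart model: a business-level aggregate built only on the staging layer,
# never directly on raw tables.
conn.execute("""
    CREATE VIEW fct_revenue AS
    SELECT SUM(amount_gbp) AS total_revenue
    FROM stg_orders
    WHERE status = 'complete'
""")

print(conn.execute("SELECT total_revenue FROM fct_revenue").fetchone()[0])
```

In actual dbt each layer would be a version-controlled `.sql` file with tests and documentation, and dbt would resolve the dependency graph and materialize the models in the warehouse; the point of the sketch is only the layered, modular structure.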
a range of global brands across industries such as (but not limited to) banking, pharmaceuticals, retail, and insurance. Plus, we have strong partnerships with tech providers including Microsoft, AWS, Snowflake, Starburst, Neo4j, Databricks, Google Cloud, and others, giving us access to cutting-edge technologies and enabling us to spearhead innovation. Who We're Looking For: Degree in any major …
Customers simplify operations, improve data security, and unlock data's value. Customers include: Roche - Saved $50M by securely operationalizing data products and saving inventory. Thomson Reuters - Faster access to Snowflake data and a 60x increase in data usage, resulting in greater productivity. Swedbank - 3x time saved setting up data security and self-service policy authoring. 2x more data use … 5x improvement in process efficiency. JB Hunt - Increased permitted use cases for cloud analytics by 100% by managing access to 100+ databases while achieving cost savings.
• Technology partners include Snowflake, Databricks, AWS, Azure, Google Cloud, and Starburst. Immuta was recognized as the Snowflake Data Security Partner of the Year in June 2023.
• Immuta has been recognized by Forbes … BuiltIn as one of the best workplaces, and by Fast Company as one of the top 50 most innovative companies.
• $267 million in total funding. Lead investors include NightDragon, Snowflake, and Databricks, along with additional funding from ServiceNow, Citi Ventures, Dell Technologies Capital, DFJ Growth, IAG, Intel Capital, March Capital, Okta Ventures, StepStone, Ten Eleven Ventures, and Wipro Ventures. …