best practices Salesforce cross-cloud integrations Proficiency in moderately complex SQL queries Experience in integrating with cloud-based data warehouses/data lakes (e.g. Snowflake, Databricks) Preferred: Experience with Marketing Cloud Personalisation Experience with at least one end-to-end Salesforce Data Cloud implementation Experience with Segmentation strategy Experience with more »
strategies & practices; DWH platforms; Release Management strategies & practices; Tableau & Power BI reporting tools; MS .Net technologies & TFS/ALM; Excel/VBA; modern data offerings such as Azure, AWS, Alteryx, Snowflake etc. Permanent 2022 more »
checks of audiences and campaign set-ups that have been created by CRM developers Vast experience in audience targeting and segmentation Experience with SQL, Snowflake, Power BI and Salesforce Marketing Cloud Data and analytics enthusiast Upon meeting with you, I will be able to see that you have: 4+ years more »
batch/risk reports or trade life cycle Experience in at least one scripting language (PowerShell, Python) and one database (Oracle, MS SQL, MongoDB, Postgres or Snowflake) CI/CD deployment pipeline involving Azure DevOps, TeamCity, Git, Jenkins etc. Ability to work well under pressure and strong communication skills Demonstrated more »
Greater London, England, United Kingdom Hybrid / WFH Options
Tekskills Inc
will also be focusing on building, designing and taking ownership of high-quality data products. Key requirements: Advanced SQL Google Cloud Platform BigQuery (or Snowflake) Terraform Full understanding of CI/CD pipelines Python Experience leading data teams (Tech lead) Offer and Benefits: Discount scheme for retail brands Remote more »
of campaign performance, driving improved targeted promotions, which in turn aligns with their hyper-personalisation strategy for the end customer. Technically they operate a Snowflake and Databricks environment, leveraging SQL to extract data and R and Python for modelling and statistical analysis, building code and tools. They also have Power BI more »
rules to analyse historical data to determine its accuracy. Experience with the use of data strategy & governance tools such as Informatica, Collibra, MS Purview, Snowflake, Snowsight. Project experience using proven delivery methodologies (e.g., Waterfall, Agile, Scrum, DevOps, Testing), frameworks and best practices. Advanced working knowledge in SQL and relational databases more »
Queens Road, Teddington, Middlesex, England Hybrid / WFH Options
LGC LIMITED
the opportunity to experience the ins and outs of working in a Data team and exposure to market-leading technologies such as Tableau, Snowflake, dbt, and SAP Business Objects. Your duties and responsibilities in this role will consist of: Supporting in handling user access requests and the approval process for more »
team of direct reports committed to driving adherence to data governance within the team and resolving escalated technical queries. Extensive experience with Snowflake is essential and working knowledge of dbt, Airflow and AWS is highly desirable. Extensive stakeholder management experience, ideally in the area of responsibility, with IT and architects more »
interface into IT. The ability to resolve complex and non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential and working knowledge of dbt, Airflow and AWS is highly desirable. Strong background developing, constructing, testing, and maintaining practical data architectures and driving improvements more »
a critical user experience element. Product development for data science/large data processing tools. A firm technical understanding of cloud technologies in key platforms like Snowflake, AWS, GCP, Azure, and Databricks. More about us: LiveRampers are empowered to live our values of committing to shared goals and operational excellence. Connecting LiveRampers to more »
About the role: My client is seeking an experienced Senior Data Engineer with a specialisation in Snowflake to own and evolve the company's data stack. The ideal candidate for this role will be a self-starter with a knack for problem-solving and a strong ability to learn new … a confident individual who sees improving inefficient code and processes as a challenge rather than a chore, and who stays up-to-date on Snowflake best practices and new features. This role is perfect for someone who enjoys the challenge of working with Snowflake and is passionate about data engineering. more »
stakeholders and technical teams. * Ability to work autonomously and as part of a team in a fast-paced environment. Nice to Have: * Experience with Snowflake or Azure data platforms. * Previous involvement in insurance data modernization initiatives. more »
knowledge of data modelling concepts and data management techniques are advantageous. Skills in implementing reports/analytics are desirable. Knowledge of AWS and Snowflake is desirable. You are responsible for: Ensuring we deliver value from data to our business stakeholders. Working with stakeholders to define, document and prioritise business requirements. Sharing more »
D&B data and will allow D&B to explore and address new use cases in the context of Cloud Service providers such as ‘Snowflake, Google Cloud and Databricks’. You'll help to shape the future strategy of D&B’s business in relation to client digital transformation, connected more »
you’ll enjoy this opportunity! Experience of database design, performance optimisation, troubleshooting and awareness of cloud-based Big Data “data warehouse” systems such as Snowflake, Databricks, BigQuery or Athena. PACKAGE DESCRIPTION Job Reference: 14922 Band: D Salary: 62,620 - 93,930 depending on relevant skills, knowledge and experience. The expected salary range more »
is instrumental in implementing the data strategy that supports front office stakeholders, systems, and clients. The Data Engineer will leverage cutting-edge technology in Snowflake, Python, SQL, and Azure to enhance our data capabilities and support the investment decision-making process. Key responsibilities include: Design, build, and maintain efficient, reliable … data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organization. Utilize Snowflake for data storage, processing, and analytics. Optimize data structures and queries to support analytics and BI initiatives. Develop scripts in Python and SQL to automate data processes, integrate … Must have skills Proven experience as a Data Engineer, with a strong background in data pipeline construction, data architecture, and data warehousing. Expertise in Snowflake, Python, SQL, and cloud-native ETL/ELT tools. Familiarity with Azure and other cloud-native technologies. Understanding of finance industry data domains and their more »
Systems, Cloudera/Hortonworks, AWS EMR, GCP DataProc or GCP Cloud Data Fusion. * NoSQL Databases. Dynamo DB/Neo4j/Elastic, Google Cloud Datastore. * Snowflake Data Warehouse/Platform * Streaming technologies and processing engines, Kinesis, Kafka, Pub/Sub and Spark Streaming. * Experience of working with CI/CD technologies … cloud platforms * Google Data Products tools knowledge (e.g., BigQuery, Dataflow, DataProc, AI Building Blocks, Looker, Cloud Data Fusion, Data prep, etc.) Relevant certifications * Python * Snowflake * Databricks To apply please click the "Apply" button and follow the instructions. For a further discussion, please contact Sam Stark - (phone number removed) 83DATA is more »