join the team on an initial 12-month contract. Requirements: 5+ years' experience working as a Data Architect; strong customer management; excellent AWS knowledge; Snowflake experience a plus; excellent Agile experience. If you are interested in a new role, apply now below. Reperio Human Capital acts as an Employment Agency …
is essential. We are seeking a DevOps Engineer with the following: experience maintaining and scaling AWS services; big data experience with tooling such as Snowflake/PostgreSQL (over 1TB daily, ideally); containerisation experience with Docker or Kubernetes. We can offer a DevOps Engineer in this team: remote working anywhere in …
in Data Science, Data Management, Information Technology, or similar. Proficient in creating models in Excel, as well as utilising data visualisation tools (Tableau) and Snowflake. Prior experience working in a large, matrixed organisation where you have played a pivotal role in supporting multiple departments on various projects. This is an …
seamless integration with the ERP system. Business Intelligence (BI) Tool Implementation: - Evaluate, scope, and recommend a cutting-edge front-end BI tool such as Snowflake, Tableau, or Power BI. - Lead the implementation and customisation of the chosen BI tool to meet the company’s analytical needs. ETL Operations: - Manage and …
of campaign performance, driving improved targeted promotions, which in turn aligns with their hyper-personalisation strategy for the end customer. Technically, they operate a Snowflake and Databricks environment, leveraging SQL to extract data and R and Python for modelling and statistical analysis, building code and tools. They also have Power BI …
with the ability to collaborate effectively with stakeholders at all levels. - Certifications in Informatica Cloud services and/or relevant cloud platforms (e.g. AWS, Snowflake) would be a plus.
sales. Ability to articulate complex solutions and align them with customers’ needs. Demonstrable experience of partner/vendor collaboration with partners such as AWS, Microsoft, Snowflake and Informatica. Characteristically a self-starter, typically self-sufficient while also collaborative, and with strong solutioning, planning and management capabilities. Excellent presentation abilities and composure …
for a Database Specialist who is keen to work on multiple database platforms. This position is responsible for operating PostgreSQL, MySQL, Oracle, DB2 and Snowflake databases for development, test, and production environments. This individual will become part of an international DBA team and will work closely with application and infrastructure …
in data modeling techniques such as entity-relationship diagramming, dimensional modeling, and data normalization. • Proficient in SQL and database management systems (e.g., MySQL, Oracle, Snowflake, etc.). • Demonstrated experience and familiarity with emerging technologies, including IoT and AI. • Proficient in data visualization tools such as Tableau, Power BI and Sigma.
and PyTorch. Experience of working with large quantities of data using, e.g., Hadoop and Spark. Knowledge of cloud-based analytical platforms such as Databricks, Snowflake, Google BigQuery. Experience with workflow and pipelining frameworks such as Kubeflow, MLflow, or Argo. Strong appreciation of ethical AI considerations. Job Reference Number 15063BR. Employee …
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
ll do and bring: Proficiency in AWS services relevant to data engineering, such as S3, Glue, EMR, Athena, and Lambda. Hands-on experience with Snowflake and Redshift cloud data warehousing solutions. Familiarity with DBT (Data Build Tool) for managing transformations in the data pipeline. Strong programming skills in technologies like Python … Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data …
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
ll do and bring: Proficiency in AWS services relevant to data engineering, such as S3, Glue, EMR, Athena, and Lambda. Hands-on experience with Snowflake and Redshift cloud data warehousing solutions. Familiarity with DBT (Data Build Tool) for managing transformations in the data pipeline. Strong programming skills in technologies like Python … Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data …
music industry. Requirements: To qualify for this role, you will require: · Strong experience with SQL and Tableau · Experience working within cloud environments (Redshift, BigQuery, Snowflake). Salary: A successful candidate will receive: · A salary of up to £52,000 · Excellent progression opportunities. Process: two interview stages. 1st stage: short conversation with …
knowledge of SQL and working with large and complex data sets. You've had some exposure to cloud-based analytical platforms such as Databricks, Snowflake, Google BigQuery etc. You have experience of marketing campaign design and analysis, working with customer data to find key insights to inform and drive change …
Camberley, Surrey, South East, United Kingdom Hybrid / WFH Options
Siemens Healthineers
Classic" or ASP.NET SQL Server skills and use with .NET clients and web-based systems Knowledge of cloud-based services such as Azure and Snowflake (Applicants will be expected to describe projects that use these skills) Our Benefits: 26 days' holiday with the option to buy or sell an additional more »
Queens Road, Teddington, Middlesex, England Hybrid / WFH Options
LGC LIMITED
the opportunity to experience the ins and outs of working in a Data team and exposure to market-leading technologies such as Tableau, Snowflake, dbt, and SAP Business Objects. Your duties and responsibilities in this role will consist of: Supporting in handling user access requests and the approval process for …
With a deep understanding of enterprise software environments, especially Cloud/SaaS analytics/Data warehouse and Business Intelligence solutions, applications & technologies including BigQuery, Snowflake, Redshift, and Databricks. Commercial acumen: you are deeply data-driven and care about what moves the needle for pipeline creation, rather than focusing on vanity …
them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, coupled with onboarding new, disparate data sets sourced from many … understanding of Python. Experience developing in the cloud (AWS preferred). Solid understanding of libraries like Pandas and NumPy. Experience in data warehousing tools like Snowflake, Databricks, BigQuery. Familiarity with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools. Commercial experience with performant database programming in SQL. Capability to solve …
Engineering Delivery Lead (AWS/Snowflake/Python/Scala) - UK based - 6 Month contract + extensions - £450-550/day + negotiable. Our client is looking for 2 Engineering Delivery Leads for workstreams on Platform Builds. The suitable individual needs to have an understanding of data modelling principles & best … practices and also needs prior experience leading a data engineering team. Key Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a …
biotech space to interact with stakeholders. Not a validation specialist, but understands the sequence and what that process needs and is involved. Data tools: Snowflake, Matillion, Tableau, Power BI.
play a pivotal role in developing a new marketplace platform from the ground up, utilizing a blend of leading technologies such as Salesforce, Adobe, Snowflake/Databricks, and Zuora. This role requires a strategic thinker who can align technology initiatives with business goals, ensuring scalability, integration, and consistency across systems.
project wins. I'm looking for an experienced Lead Data Engineer to play a pivotal role in building state-of-the-art Databricks and Snowflake Lakehouse solutions from scratch. If you have experience with Azure, Databricks or Snowflake, SQL, ETL, and a passion for consulting, please get in touch. Key … Responsibilities: Design, develop, and maintain cutting-edge Databricks or Snowflake Lakehouse solutions from scratch. Collaborate with cross-functional teams to understand data requirements and translate them into efficient ETL processes. Work with Azure services to ensure data storage, security, and scalability. Develop and optimize SQL queries for data extraction and …
is instrumental in implementing the data strategy that supports front office stakeholders, systems, and clients. The Data Engineer will leverage cutting-edge technology in Snowflake, Python, SQL, and Azure to enhance our data capabilities and support the investment decision-making process. Key responsibilities include: Design, build, and maintain efficient, reliable … data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organization. Utilize Snowflake for data storage, processing, and analytics. Optimize data structures and queries to support analytics and BI initiatives. Develop scripts in Python and SQL to automate data processes, integrate … Must-have skills: Proven experience as a Data Engineer, with a strong background in data pipeline construction, data architecture, and data warehousing. Expertise in Snowflake, Python, SQL, and cloud-native ETL/ELT tools. Familiarity with Azure and other cloud-native technologies. Understanding of finance industry data domains and their …
allowance for use within our subsidised onsite canteen. Must-have skills: Working knowledge of Azure DevOps (Git, build and release pipelines), Python, Databricks, and Snowflake as a requirement. Nice-to-haves are: Power BI and Attunity Replicate. Significant information technology and/or application development, database development and/or Python … OpenID Connect, and JWT tokens. Database Skills: Proficiency in working with databases, particularly SQL databases like Microsoft SQL Server, Azure SQL Database or Snowflake. Agile Methodologies: Experience with Agile development methodologies like Scrum or Kanban for project management and collaboration.
A highly prestigious Investment Management firm is looking to hire a Senior Cloud Data Engineer with strong Python and Snowflake experience to join their new Data & AI team in a permanent position. The Cloud Data Engineer will join alongside Data Scientists and Data Modellers to manage and improve the cloud … based Data Warehouse (Azure, Snowflake), and guide the organisation through an unprecedented period of data change. Your key responsibilities will be: Engineering scalable and secure cloud data solutions for high-performance Data Science workloads and Generative AI applications. Enhancing and managing the cloud-based Data Warehouse (Azure, Snowflake). Designing, building … you will have: 5+ years' experience in cloud data engineering, with specific experience in building and maintaining cloud Data Warehouses. 2+ years' experience of Snowflake configuration, deployment and maintenance. Extensive hands-on experience of using Python commercially (scikit-learn, pandas, NumPy, etc.). Extensive hands-on experience of using Azure to …