leadership position. Proven track record of building and scaling data teams and capabilities in a global context. Deep understanding of data architecture, data warehousing, and modern analytics platforms (e.g., Snowflake, Power BI, Tableau, Databricks). Hands-on experience with Azure Data Factory, Azure Data Lake, and Microsoft Fabric. Strong knowledge of data governance, privacy regulations (e.g., GDPR), and data lifecycle …
visualization tools, such as Tableau.
• Utilize web development technologies to build front-end UIs for risk management actions
• Develop software for calculations using databases such as Snowflake, Sybase IQ, and distributed HDFS systems
• Interact with business users to resolve issues with applications
• Design and support batch processes using scheduling infrastructure for calculating and distributing data to other …
technologies essential for automating models and advancing our engineering practices. You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery) and with a modern data modelling technology (dbt). You document and communicate clearly; some experience with technical content writing would be a plus. You are excited …
queries, alongside the ability to upskill team members through peer review, training, and best-practice development. Data Infrastructure: A solid understanding of cloud-based data warehousing (e.g., Azure, Fabric, Snowflake, Redshift, BigQuery) and ETL processes. Analytical Prowess: Exceptional analytical and problem-solving skills, with the ability to translate complex data into clear, actionable business insights. Demonstrated proficiency in Python/…
for quantitative trading and analytics, with demonstrated expertise in:
• The Python data engineering stack (Polars, Parquet, FastAPI, Jupyter, Airflow, Streamlit, Ray)
• High-performance data stores and query engines (Starburst, Snowflake)
• Real-time streaming analytics technologies (Kafka, Flink)
• Cloud and container technologies (AWS, Azure, GCP, Docker, Kubernetes)
Proven success in enhancing developer experience, reducing friction in coding, building, and deploying APIs …
classical statistics (hypothesis testing, regression, significance, p-value pitfalls)
Ability to translate data into plain-English insights and present to C-level audiences
Experience working in cloud data warehouses (Snowflake, BigQuery, Redshift) and with version control (Git)
What we can offer: Bonus; hybrid working, meaning you'll be in our Farringdon office Tuesdays to Thursdays; 25 days annual leave, plus the …
role and a seasoned architect with solid knowledge of current technology trends
• Expertise in relational and non-relational databases, including Oracle, MS SQL, and modern cloud-based solutions (Snowflake, Redshift, etc.)
• 7 to 10 years of experience with enterprise data architecture, database management, and data engineering
• Proven experience leading one or more data transformation projects from concept to delivery
• Deep …
fast-moving cloud environment. This engineer must be highly proficient in managing large data sets, with excellent ANSI-SQL skills for querying structured and unstructured data sources (Snowflake, Oracle, SQL, NoSQL). Understand proper coding techniques, testing requirements, and debugging techniques. Conduct code reviews and peer reviews. Maintain comprehensive technical documentation to support ongoing development and future …
concepts to a wide variety of audiences. For candidates applying for the Senior Consultant role, we additionally require: Working experience with at least one Cloud Platform (AWS, Azure, GCP, Snowflake, Databricks etc.) and exposure to Cloud Architecture principles. Demonstrated experience in people management, product owner or workstream management. Experience supporting and participating in the commercial cycle, including defining project scope …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Capgemini
a related role. Proficient in SQL, Python or R, and experienced with BI tools such as Tableau. Strong analytical, structured, and critical thinking skills. Familiar with data warehousing concepts (Snowflake, AWS). Excellent attention to detail and proficiency in the MS Office Suite. A proactive self-starter with strong time and task management skills. Eagerness to learn and adapt to new …
Essential Skills & Experience:
• Minimum 5 years of experience in a Data Engineering role or equivalent
• Strong Python and SQL proficiency
• Proven experience with a modern data warehousing solution (e.g. Snowflake, Azure Data Factory)
• Experience with modern Python-based orchestration tools (e.g. Prefect)
• Familiarity with Git-based development workflows and CI/CD pipelines
Package: Salary up to £85,000 (depending …
build and manage data pipelines. Our no-code/low-code ETL platform allows seamless integration of data from any source, whether databases, applications, or files, into lakehouses like Snowflake, Databricks, and Redshift. With pipelines that just work and features like advanced data transformation using dbt Core and end-to-end pipeline observability, we're focused on making robust data pipelines …
teams. About You: You'll bring: Proven experience in data architecture, data modelling, and database design. Strong knowledge of cloud platforms (Azure, AWS, or GCP) and modern data technologies (e.g., Snowflake, Databricks, Kafka). Expertise in SQL, Python, or other data-centric programming languages. Familiarity with data governance, security, and compliance frameworks. Excellent communication and stakeholder management skills. Why Join Us …
Belfast, County Antrim, Northern Ireland, United Kingdom Hybrid / WFH Options
Hays
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Forward Role
data processing and ETL workflows
Hands-on experience with cloud platforms, specifically Azure
Experience designing and maintaining data pipelines using tools like Databricks and PySpark
Knowledge of data warehousing solutions; Snowflake experience would be brilliant
Understanding of CI/CD processes for deploying data solutions
Some exposure to big data technologies and distributed processing
Nice to haves: ML pipeline experience; previous …
Experience engaging senior stakeholders, interpreting business needs, and operating with minimal supervision
Desirable Experience:
Python for data transformation or analysis
DAX for advanced Power BI metrics
Experience working with Snowflake
Use of Git for version control
Streamlit or similar tools for lightweight data apps
This role has been deemed Outside IR35 by the client. Applicants must hold, or be happy …
end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
around CI/CD, infrastructure-as-code, and modern data tooling
Introduce and advocate for scalable, efficient data processes and platform enhancements
Tech Environment: Python, SQL, Spark, Airflow, dbt, Snowflake, Postgres, AWS (S3), Docker, Terraform
Exposure to Apache Iceberg, streaming tools (Kafka, Kinesis), and ML pipelines is a bonus
What We're Looking For: 5+ years in Data Engineering, including …
including Lambda, ECS/EC2, S3, and RDS. Deep experience with Terraform and infrastructure-as-code practices. Familiarity with tools like Airflow or dbt, and data platforms such as Snowflake or Databricks. Solid experience with CI/CD, observability, and platform reliability practices in cloud-native environments. Understanding of distributed computing concepts, and experience designing systems for scale, security …
and with external clients. Strong hands-on experience using SQL for multi-step/complex analytics is essential for this role. Experience in cloud platforms (GCP – BigQuery, Azure – Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This is not an advanced technical data science role, so …
City of London, London, United Kingdom Hybrid / WFH Options
MRK Associates
Market PAS platforms (e.g. OpenTWINS, DXC Assure, Sequel, IRIS). Knowledge of BI/MI tooling (e.g. Power BI, Tableau, Qlik). Familiarity with data warehouse technologies (e.g. SQL, Snowflake, Azure, Informatica). Exposure to Agile delivery and use of tools such as Jira or Azure DevOps. Certification in Project Management (PMP, PRINCE2) or Agile (Scrum Master, PMI-ACP) …
sales, and account teams to uncover real-world data needs and friction points. Build working data products, from custom Python pipelines and enriched datasets to Power BI templates and Snowflake-ready views. Support pre- and post-sales with prototypes, demos, onboarding materials, and technical discovery. Act as a trusted technical advisor, helping clients see the value in Signal's unique …