solutions including the choice of data sources and ETL approach. Familiar with engineering processes for developing APIs. Understanding of the principles of building solutions using Snowflake, open-source frameworks, and multi-cloud infrastructure. This is a contract position.
processes, and technologies. Strong SQL skills (ideally with Azure SQL), experience working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL …
London, South East England, United Kingdom Hybrid / WFH Options
Orbis Group
CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the role: roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply …
London, South East England, United Kingdom Hybrid / WFH Options
Durlston Partners
skills in Python and Java 11+, with a good grasp of frameworks like Dropwizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/…
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS & IaaS), Cosmos DB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python programming, and Azure Purview. If you're ready to join a dynamic team and drive …
for achieving project success. Key Responsibilities: Software Development: Write high-quality, maintainable code using languages such as Python and SQL. Establish data tools like Snowflake and Azure Data Lake Storage (ADLS) Gen2. Utilize Power BI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. …
experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle Cloud, R, Python. Additional Information: Location: This role can be delivered in a hybrid nature from one of these offices …
for 5 or more consecutive years. Demonstrated experience in data architecture or a similar role. Practical experience across a variety of platforms and languages, e.g. Databricks, Snowflake, Azure, AWS, Oracle Cloud, R, Python, or similar. Understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing …
London, South East England, United Kingdom Hybrid / WFH Options
Harrington Starr
an extremely fast-paced environment. Within this role, you will be responsible for building data pipelines for a cloud-based warehouse using Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing …
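Several of these listings centre on building ETL pipelines into a cloud warehouse. A minimal sketch of the extract-transform-load pattern in Python follows; sqlite3 stands in here for a warehouse such as Snowflake, and the table, column, and function names are all illustrative, not taken from any listing:

```python
import sqlite3

def extract(rows):
    # Extract: in a real pipeline this would pull from an API or source database.
    return rows

def transform(rows):
    # Transform: normalise names and drop records with a missing value.
    return [(name.strip().lower(), value) for name, value in rows if value is not None]

def load(conn, rows):
    # Load: write the cleaned rows into the warehouse table.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, value REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
raw = [(" Alice ", 10.0), ("BOB", None), ("Carol", 5.5)]
load(conn, transform(extract(raw)))
total = conn.execute("SELECT COUNT(*), SUM(value) FROM sales").fetchone()
print(total)  # (2, 15.5) -- the None row was dropped, names were normalised
```

The same extract/transform/load split maps directly onto warehouse tooling (e.g. Snowpipe for load, dbt for transform); only the stand-in storage layer changes.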
London, South East England, United Kingdom Hybrid / WFH Options
Anson McCade
SSIS, Talend or Pentaho; data governance and data management tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra; MySQL, SQL Server, Oracle, Snowflake, PostgreSQL and NoSQL databases; programming languages such as Spark or Python; Amazon Web Services, Microsoft Azure or Google Cloud; and distributed processing technologies such as …
experience developing ML or statistical models related to pricing. Strong familiarity with data visualization software (e.g., Tableau, Power BI) and data management tools (e.g., SQL, Snowflake). Bonus points for: Experience implementing machine learning models and familiarity with large language models. Knowledge of cloud-based solutions on major providers (Azure, GCP…
data and information systems design. Experience with data management, database management systems, data services (including APIs), and enterprise data platform technologies (e.g. Fabric, Databricks, Snowflake, etc.). Experience architecting data-centric solutions at a conceptual, logical and physical level. Experience in architecting, designing and implementing solutions on AWS and Microsoft Azure …
their investment models. Tech is also completely flexible. Most of the work is done within Python, C# and Scala with a range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a …
design, implement and manage data lake/data warehouse platforms (some of the following types of providers: AWS, Microsoft Azure, Google Cloud Platform, Databricks, Snowflake, Cloudera, Spark, MongoDB). Done this at companies using high volumes of data, ideally in retailing. Other sectors using high volumes of data would also be …
in Docker. Good experience in Linux. Good experience in Airflow. Good knowledge of cloud architecture. Good experience in Terraform. Expert experience with database systems (Snowflake, SQL, Postgres, etc.). Experience of micro-service development and API development. Strong understanding of Agile delivery methodologies. Demonstrable ability to design and deliver complex systems …
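For the API-development requirement in the listing above, a minimal sketch of a micro-service-style health endpoint using only the Python standard library; the route and JSON payload are illustrative, not from any listing:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class HealthHandler(BaseHTTPRequestHandler):
    # A single read-only endpoint, as a micro-service health check might expose.
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging to keep output clean.
        pass

# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/health"
payload = json.loads(urlopen(url).read())
server.shutdown()
print(payload)  # {'status': 'ok'}
```

In practice a framework (Flask, FastAPI, DropWizard on the Java side) would replace the hand-rolled handler, but the request-route-respond cycle is the same.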