experience with BI tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally AWS Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems more »
with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments more »
technical leadership role. Experience in one of the main cloud services (AWS, Google Cloud or Azure) and Big Data services (EMR, Databricks, Synapse, HDInsight, Kinesis, Snowflake, etc.). Skilled in use of the Power Platform, including Power Apps, Power Automate and Power BI. Advanced knowledge of Microsoft Excel and other office applications. Relevant more »
Extensive use of cloud technologies such as AWS and GCP. • Good working knowledge of Data Warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake). • Experience in deploying and scheduling code bases in a data development environment, using technologies such as Airflow. • Demonstrable experience of working alongside cross-functional more »
required. Expertise with Core Java, namely multithreading; some accompanying Python is also acceptable. Advanced SQL. Experience with cloud technologies is a plus (AWS, Snowflake, etc.). Familiarity with equities and equity derivatives within a real-time electronic trading environment is required. Strong communication skills; ability to liaise with investment professionals more »
first use cases for genAI. Qualifications. What you’ll need: • Technical expertise in tools spanning data warehousing, ETL, internal visualisation and analytics. Good examples are Snowflake, GCP, Azure Analytics, Sagemaker, Databricks, Tableau, PowerBI, Looker, Quicksight, Airflow, astronomer.io, Alteryx, Collibra. • Hands-on experience and/or a detailed and deep understanding of the more »
with varying data proficiency. Thorough understanding of data lake and data warehousing principles and full project involvement in one or more major technology platforms, e.g. Snowflake, Databricks. Proven experience with one or more Cloud Services providers, e.g. AWS, Azure or Google Cloud Platform. Good understanding of role-based access control, its importance in more »
Birmingham, West Midlands (County), United Kingdom
Hippo Digital
Data Engineer/Software Engineer/ML Engineer on an open-source tech stack with Python, Scala, Spark, ‘modern data’ platforms such as Databricks, Snowflake etc., cloud platforms, and database technologies. Track record of leading and motivating high-performance engineering staff through a ‘leading by example’, hands-on approach. Senior EM would have more »
solutions, including the choice of data sources and ETL approach. Familiar with engineering processes for developing APIs. Understanding of the principles of building solutions using Snowflake, open-source frameworks, and multi-cloud infrastructure. In Return: A bonus scheme that pays up to 20%, and a benefits package that is one of the more »
CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these Cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the Role: Roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply more »
skills in Python and Java 11+, with a good grasp of frameworks like DropWizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/ more »
total experience in DWBI, Big Data, Cloud Technologies • Implementation experience and hands-on experience in at least 2 of the Cloud technologies: Azure, AWS, GCP, Snowflake, Databricks • Must have hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache more »
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS & IaaS), CosmosDB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python Programming, and Azure Purview. If you're ready to join a dynamic team and drive more »
experience with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Excellent communication skills and experience in managing more »