experience with BI tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems …
cloud technologies such as EMR, Lambda, EC2, and data pipelines. Experience leading data warehousing and analytics projects, including using technologies such as Airflow, Jenkins, Snowflake, and Kinesis. Experience with Agile, DevOps, and CI/CD frameworks in cloud-based environments. Exposure to at least one dashboarding tool such as Tableau, Power BI, or Sisense …
solutions, including the choice of data sources and ETL approach. Familiarity with engineering processes for developing APIs. Understanding of the principles of building solutions using Snowflake, open-source frameworks, and multi-cloud infrastructure. In Return: A bonus scheme that pays up to 20%, and a benefits package that is one of the …
CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the Role: Roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply …
required. Expertise with Core Java, particularly multithreading; some accompanying Python is also acceptable. Advanced SQL. Experience with cloud technologies is a plus (AWS, Snowflake, etc.). Familiarity with equities and equity derivatives within a real-time electronic trading environment is required. Strong communication skills; ability to liaise with investment professionals …
Better Placed Ltd - A Sunday Times Top 10 Employer in 2023!
pipelines using ETL tools like Apache Spark or Apache Beam. Experience with cloud-based data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake. Proficiency in SQL and NoSQL databases. Familiarity with data modeling techniques and implementing efficient database schemas. Knowledge of distributed systems, containerization, and container orchestration tools …
processes, and technologies. Strong SQL skills (ideally with Azure SQL), experience working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL …
skills in Python and Java 11+, with a good grasp of frameworks like Dropwizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/…
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS & IaaS), Cosmos DB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python programming, and Azure Purview. If you're ready to join a dynamic team and drive …
experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking, and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle Cloud, R, and Python. Additional Information. Location: This role can be delivered on a hybrid basis from one of these offices …
for 5 or more consecutive years. Demonstrated experience in data architecture or a similar role. Practical experience across a variety of platforms and languages, e.g. Databricks, Snowflake, Azure, AWS, Oracle Cloud, R, Python, or similar. Understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing …
an extremely fast-paced environment. Within this role, you will be responsible for building data pipelines for a cloud-based warehouse using Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing …
Experience working in an electronic/systematic trading or investment firm. Experience working directly with Portfolio Managers, Traders, Quants, and/or Researchers. AWS, Snowflake. JavaScript, TypeScript, HTML5, React. .NET, C#, Java, JEE, Jakarta EE, Spring, object-relational mappers (ORM). RESTful web services and microservices implementations. Data visualisation. Role Description …
an opportunity you would like to apply to, please review the necessary competencies below: Knowledge of ETL, analytics, and data warehousing. Experience with cloud platforms: AWS, Snowflake, BigQuery. Experience in building a data-driven culture in the form of self-serve analytics. Previous hands-on experience in building a best-in-class …
experience developing ML or statistical models related to pricing. Strong familiarity with data visualization software (e.g., Tableau, Power BI) and data management tools (e.g., SQL, Snowflake). Bonus points for: Experience implementing machine learning models and familiarity with large language models. Knowledge of cloud-based solutions on major providers (Azure, GCP …