Required: expertise in Core Java, particularly multithreading; some Python is also acceptable. Advanced SQL. Experience with cloud technologies is a plus (AWS, Snowflake, etc.). Familiarity with equities and equity derivatives within a real-time electronic trading environment is required. Strong communication skills; ability to liaise with investment professionals
Experience working in an electronic/systematic trading or investment firm. Experience working directly with Portfolio Managers, Traders, Quants and/or Researchers. AWS, Snowflake; JavaScript, TypeScript, HTML5, React; .NET, C#, Java, JEE, Jakarta EE, Spring, object-relational mappers (ORM); RESTful web services; microservices implementations; data visualisation. Role Description
skills in Python and Java 11+, with a good grasp of frameworks like Dropwizard. Lakehouse architectures: familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS services: hands-on experience with AWS, especially S3, ECS, and EC2/
modelling and Data Vault 2.0 architectures) Key Responsibilities: Build and maintain scalable data pipelines written in Python and SQL and run on AWS/Snowflake. Take ownership of data quality within projects. Manage and educate a range of stakeholders when gathering requirements and delivering data projects. Build effective and collaborative
experience with BI tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems
for achieving project success. Key Responsibilities: Software development: write high-quality, maintainable code using languages such as Python and SQL. Establish data tools like Snowflake and Azure Data Lake Storage (ADLS) Gen2. Utilize Power BI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports.
solutions, including the choice of data sources and ETL approach. Familiarity with engineering processes for developing APIs. Understanding of the principles of building solutions using Snowflake, open-source frameworks, and multi-cloud infrastructure. This is a contract position.
General knowledge of relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery). Excellent scripting skills (e.g., Python, SQL). Strong analytical and problem-solving skills. Excellent communication and interpersonal skills, with the ability to
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS and IaaS), Cosmos DB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python programming, and Azure Purview. If you're ready to join a dynamic team and drive
Analytics. Experience working with large, complex data warehouses and/or data lakes. Familiarity with cloud-based analytics platforms such as AWS, Azure, Snowflake, Google Cloud Platform (BigQuery), Spark, and Splunk. Proficiency in SQL and experience using one or more of the following languages: R, Python, Scala, and
of AWS infrastructure, including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis, and QuickSight. We also make wide use of other technologies such as Snowflake, dbt, Databricks, Informatica, Matillion, Airflow, Tableau, Power BI, etc. The Lead Data Architect will liaise with clients to define requirements, refine solutions, and ultimately hand
Advanced knowledge of data visualization tools and dashboard design experience essential (Tableau preferred). Experience with large databases and data warehouses required (Snowflake preferred). Experience transitioning and deploying data science and quantitative models to production environments required, along with exposure to Agile development processes; ability to articulate
languages or toolsets: AutoSys, Azure Function App, Azure Git, Azure Portal, C#, Databricks, GraphQL/Graph API, Informatica CDI, Informatica PowerCenter, Java, JavaScript, Snowflake, Power BI, PyRecs, Python, Selenium, Spark, and SQL. Nice-to-have skills: ability to propose and estimate the financial impact of architectural alternatives; existing knowledge of
Spark experience. Must have strong AWS experience. Must have Terraform experience. SQL and NoSQL experience. Have built out data warehouses and data pipelines. Strong Databricks and Snowflake experience. Docker, ECS, Kubernetes, and orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to
interactive visual reporting dashboards + integrating visualizations into web applications. Azure Cloud: designing + implementing Azure cloud-based solutions + using data tools like Snowflake and Azure Data Lake Storage (ADLS). Developing solutions for Proof of Concept (POC) + Minimum Viable Product (MVP). Advantageous: Banking, JavaScript, HTML, CSS
interactive visual reporting dashboards + integrating visualizations into web applications. Azure Cloud: designing + implementing Azure cloud-based solutions + using data tools like Snowflake and Azure Data Lake Storage (ADLS). Developing solutions for Proof of Concept (POC) + Minimum Viable Product (MVP). Advantageous: active SC Security Clearance