Required: expertise with Core Java, particularly multithreading; some Python is also acceptable. Advanced SQL. Experience with cloud technologies (AWS, Snowflake, etc.) is a plus. Familiarity with equities and equity derivatives within a real-time electronic trading environment is required. Strong communication skills and the ability to liaise with investment professionals …
Experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments …
Experience working in an electronic/systematic trading or investment firm. Experience working directly with Portfolio Managers, Traders, Quants and/or Researchers. AWS, Snowflake. JavaScript, TypeScript, HTML5, React. .NET, C#, Java, JEE, Jakarta EE, Spring, object-relational mappers (ORMs). RESTful web services and microservices implementations. Data visualisation. Role Description …
permanent. Looking for: 3+ years of professional data engineering experience. Proficiency in Python and Java 11+. Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. Hands-on experience with AWS. Ability to work effectively with both business and …
skills in Python and Java 11+, with a good grasp of frameworks like Dropwizard. Lakehouse Architectures: familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: hands-on experience with AWS, especially S3, ECS, and EC2/…
modelling and data vault 2.0 architectures) Key Responsibilities: Build and maintain scalable data pipelines written in Python and SQL and run on AWS/Snowflake. Take ownership of data quality within projects. Manage and educate a range of stakeholders when gathering requirements and delivering data projects. Build effective and collaborative …
experience with BI tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems …
for achieving project success. Key Responsibilities: Software Development: Write high-quality, maintainable code in languages such as Python and SQL. Set up data tools such as Snowflake and Azure Data Lake Storage (ADLS) Gen2. Use Power BI, Tableau, or similar tools to design and create interactive, visually appealing dashboards and reports. …
West Bend, Wisconsin, United States Hybrid / WFH Options
Delta Defense
able to apply them. 2+ years of dbt experience required. 5+ years of experience in data engineering required. Prior experience delivering within cloud data warehouses (Snowflake, Redshift, BigQuery) required. 2+ years of experience in a business intelligence role desired. Ability to work efficiently with AWS cloud technologies such as S3, EMR …
solutions, including the choice of data sources and ETL approach. Familiarity with engineering processes for developing APIs. Understanding of the principles of building solutions using Snowflake, open-source frameworks, and multi-cloud infrastructure. In Return: a bonus scheme that pays up to 20%, and a benefits package that is one of the …
General knowledge of relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery). Excellent scripting skills (e.g., Python, SQL). Strong analytical and problem-solving skills. Excellent communication and interpersonal skills, with the ability to …
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS and IaaS), Cosmos DB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python programming, and Azure Purview. If you're ready to join a dynamic team and drive …
Analytics. Experience working with large, complex data warehouses and/or data lakes. Familiarity with cloud-based analytics platforms such as AWS, Azure, Snowflake, Google Cloud Platform (BigQuery), Spark, and Splunk. Proficiency in SQL and experience using one or more of the following languages: R, Python, Scala, and …