integrating them with AWS services like SageMaker. Familiarity with multi-cloud environments and hybrid cloud architectures. Knowledge of third-party data tools such as Snowflake, Databricks, or Tableau. Experience with real-time analytics and event-driven architectures using tools such as Apache Kafka. Background in big data technologies such as more »
Spark, Kafka). Experience with AWS and its data services (e.g., S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Knowledge of data orchestration tools (e.g., Prefect, Apache Airflow). Familiarity with CI more »
systems onto cloud platforms, one of the division's key strategies, through which you'll get exposure to technologies like AWS S3 and Snowflake. Preferred Qualifications: Knowledge or interest in investment banking or financial instruments. Experience with big data concepts (we use Hadoop for our Data Lake). Experience with more »
Engineering and Analytics: Work closely with data teams to define robust data pipelines and scalable cloud-based data platforms using tools like Apache Kafka, Snowflake, or Databricks. Monitoring and Performance Tuning: Implement advanced monitoring and observability solutions using tools like Prometheus, Grafana, or Datadog to proactively identify and resolve performance more »
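Several of these listings pair streaming pipelines (Apache Kafka) with observability tooling (Prometheus, Grafana, Datadog). As a rough illustration of how those two requirements meet in practice, here is a minimal, hypothetical Python sketch; the topic name, port, and metric names are invented for the example and not taken from any listing.

```python
# Hypothetical sketch, assuming the kafka-python and prometheus_client
# packages: consume a Kafka topic and expose throughput/latency metrics
# for Prometheus to scrape. Topic, port, and metric names are invented.
import time

from kafka import KafkaConsumer
from prometheus_client import Counter, Histogram, start_http_server

EVENTS = Counter("pipeline_events_total", "Events consumed from Kafka")
LATENCY = Histogram("pipeline_process_seconds", "Per-event processing time")

def main() -> None:
    start_http_server(8000)  # serves /metrics for the Prometheus scraper
    consumer = KafkaConsumer("events", bootstrap_servers="localhost:9092")
    for message in consumer:  # blocks, consuming indefinitely
        start = time.monotonic()
        # ... transform/load step (e.g., staging a batch for Snowflake) ...
        EVENTS.inc()
        LATENCY.observe(time.monotonic() - start)

if __name__ == "__main__":
    main()
```

Grafana or Datadog would then sit on top of metrics like these for dashboards and alerting.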
as Code (IaC) tools such as CDK or Terraform. Familiarity with monitoring and logging tools (e.g., AWS CloudWatch) for observability. Experience with Snowflake for data storage and retrieval. Where next? Let’s talk about this role. About NewDay: We help people move forward with credit and help our more »
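For readers unfamiliar with the CDK style of Infrastructure as Code this listing names, a minimal sketch using the CDK v2 Python bindings follows; the stack and construct ids are assumptions for illustration, not from the listing.

```python
# Hypothetical AWS CDK v2 sketch (Python bindings): one stack containing
# one versioned S3 bucket. Stack and construct ids are illustrative only.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataPlatformStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Landing bucket for raw files; CDK generates the physical name.
        s3.Bucket(self, "RawLanding", versioned=True)

app = cdk.App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()  # emits a CloudFormation template under cdk.out/
```

Terraform expresses the same idea declaratively in HCL; the CDK generates the equivalent CloudFormation from ordinary code.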
in financial data visualization tools; expertise in TypeScript is a plus. Familiarity with data science tools such as Spark, pandas, DuckDB, Databricks, or Snowflake is advantageous. Proven track record of leading development teams in fast-paced environments. Experience working within FinTech or Financial Services sectors. Strong communication and leadership more »
City Of London, England, United Kingdom Hybrid / WFH Options
Harrington Starr
to communicate fluently with both technical and non-technical audiences. Experience with data science tools, e.g., one or more of Spark, pandas, DuckDB, Databricks, Snowflake. Knowledge of agile development and continuous delivery methodologies. Contact Ciara Clarke at Harrington Starr for a confidential discussion on this role. more »
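For context on the data science tooling this listing names, here is a small, self-contained example of the DuckDB plus pandas workflow; the trades data is synthetic, purely for illustration.

```python
# DuckDB querying an in-memory pandas DataFrame with plain SQL.
# The trades data below is made up for the example.
import duckdb
import pandas as pd

trades = pd.DataFrame({
    "symbol": ["AAPL", "AAPL", "MSFT"],
    "qty": [100, 50, 200],
    "price": [189.2, 190.1, 411.5],
})

# duckdb.sql can reference in-scope DataFrames by variable name.
summary = duckdb.sql(
    "SELECT symbol, SUM(qty * price) AS notional FROM trades GROUP BY symbol"
).df()
print(summary)
```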
trading industries. Experience in object-oriented development with strong software engineering foundations. Experience with data-engineering cloud technologies such as Apache Airflow, Kubernetes (K8s), ClickHouse, Snowflake, Redis, caching technologies, and Kafka. Experience with relational and non-relational databases. Proficient in SQL and query optimization. Experience with designing infrastructure to maintain high more »
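As an illustration of the "Redis, caching technologies" requirement above, a minimal read-through cache sketch follows; the key scheme, the 60-second TTL, and the stubbed database call are assumptions, not anything from the listing.

```python
# Hypothetical read-through cache: Redis in front of a slow query.
import json

import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def load_positions_from_db(account_id: str) -> dict:
    # Stub standing in for a slow relational query.
    return {"account": account_id, "positions": []}

def get_positions(account_id: str) -> dict:
    key = f"positions:{account_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)                # cache hit
    result = load_positions_from_db(account_id)  # cache miss: hit the DB
    r.setex(key, 60, json.dumps(result))         # keep for 60 seconds
    return result
```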
consultancy. Knowledge of data modelling techniques (dimensional modelling, data vault, etc.). Proficiency in SQL and Python. Experience with cloud-based data platforms (preferably Snowflake and/or Databricks). A consultative approach to problem-solving, with strong interpersonal and communication skills to liaise effectively with clients and stakeholders. Bachelor more »
JavaScript). Experience building frontend applications (we use TypeScript, React, and GraphQL). Experience with various database technologies and query languages (we use Neo4j, Kafka, MySQL, Snowflake, Elasticsearch, and more). Familiarity with low-latency techniques to help improve page load time and reliability. Experience with microservices, APIs, and related standards such as more »
Python. Experience with "big data" technologies and data platforms: we use BigQuery, Apache Ibis, sqlglot, and dbt; you might have experience with Hadoop, Hive, Redshift, Snowflake, Spark, or similar. Experience with version control and CI/CD: we use Git and GitHub Actions. Fluency with Unix or macOS shells, SSH. Shell more »
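Of the tools this listing names, sqlglot is perhaps the least familiar; a tiny example of what it does (transpiling SQL between dialects) follows. The query is invented for the example.

```python
# sqlglot transpiles SQL between dialects; here, Hive to BigQuery.
import sqlglot

hive_sql = "SELECT ts, COUNT(*) AS n FROM events GROUP BY ts"
print(sqlglot.transpile(hive_sql, read="hive", write="bigquery")[0])
```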
problem spaces using a meticulous, precise, and thorough approach to working. We use and teach: TypeScript, React, and Node.js for our platform; Postgres and Snowflake for data storage; GitHub for source control and Jira for change management; Docker and JFrog to package our services into containers; GitHub and Argo CD more »
years in data engineering or data science consulting, with strong ETL pipeline development and cloud-based environment experience (Azure, AWS, Databricks, or Snowflake). Proficient in Python (including NumPy, pandas, scikit-learn), SQL, dimensional modeling, Power BI, Git, CI/CD, and VS Code/PyCharm. Proficiency in English and French more »
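As a small, hypothetical illustration of the NumPy/pandas/scikit-learn stack this listing expects, here is a self-contained model-fit sketch on synthetic data; nothing here comes from the listing itself.

```python
# Synthetic data, train/test split, and a simple classifier fit.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                  # three synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic binary target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```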
a strong data focus. Experience in the front office of a financial company, ideally within commodities, will also be required; additional technical experience with Snowflake and Terraform would be a huge plus. You’ll have lots of input into building all kinds of new tools and models, and you’ll more »
and develop data warehouses. Provide guidance and mentorship to junior team members. To be successful in this role you will have: extensive experience with Snowflake; experience with dbt, Airflow, or Python; cloud data engineering experience with AWS/Azure/GCP. This is a hybrid role based at the company's more »
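A minimal sketch of the Airflow plus dbt pattern this listing describes follows; the DAG id, schedule, and bash commands are placeholders, the dbt step assumes an already-configured dbt project, and the syntax assumes Airflow 2.4 or later.

```python
# Hypothetical Airflow DAG: load raw data, then run dbt transformations.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_raw = BashOperator(task_id="load_raw", bash_command="echo 'load raw data'")
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run")
    load_raw >> dbt_run  # transform only after the load succeeds
```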
pipelines, and doing transformation and ingestion in a specific tech suite. Minimum Qualifications: Extensive experience in implementing solutions around the AWS cloud environment (S3, Snowflake, Athena, Glue). In-depth understanding of database structure principles. Strong knowledge of database structure systems and data mining. Excellent understanding of Data Modelling & Kinesis. more »
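For the S3/Athena/Glue stack in this listing, a minimal boto3 sketch follows; the region, database, query, and output bucket are made up for the example.

```python
# Hypothetical boto3 sketch: run an Athena query over data in S3.
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

response = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM logs GROUP BY status",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution with this id
```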