NoSQL databases is also desirable. Solid experience of at least one Cloud provider such as AWS, Azure or GCP. Solid experience of working with Snowflake and dbt. Strong experience of working with different data formats, e.g., CSV, JSON and XML. Building reliable Data Pipelines. Experience in Data Modelling is highly …
in an Agile Software Development environment. • A demonstrated team player who enables joint success, and a continuous learner. Preferred Qualifications • Experience with data platforms (e.g., Databricks, Snowflake, Amazon Redshift, Palantir, etc.) and technologies for processing large datasets. • Familiarity with distributed platforms such as Kubernetes or those offered by cloud providers (AWS, GCP …
Advanced knowledge of modern data architectures (data lakes, data warehouses, data mesh). Proficiency in cloud platforms (GCP, AWS, Azure) and data tools (Databricks, Snowflake). Comprehensive programming skills (Python, Java, Scala, SQL, NoSQL). Agile delivery experience within a matrixed organisational structure. A history of crafting and executing successful …
to the role. Key Responsibilities: Design, develop, and support data solutions, including performance tuning, troubleshooting, and system administration. Build and maintain data solutions using Snowflake, dbt, Fivetran, Azure Cloud, Python, Docker, and SQL. Participate in the development lifecycle using Agile/DevOps methods and Azure DevOps. Translate requirements into actionable …
with relational and non-relational databases. Qualifications/Nice to have: Experience with a messaging middleware platform like Solace, Kafka or RabbitMQ. Experience with Snowflake and distributed processing technologies (e.g., Hadoop, Flink, Spark …
API Gateway, ECS). Experience designing and developing microservices with clean domain boundaries. Good working knowledge of SQL and cloud databases (Aurora/Postgres, DynamoDB, Snowflake etc.). Familiarity with IaC tooling such as AWS CDK, CloudFormation or Terraform. Experience working in Agile/Kanban teams with strong TDD principles. CI/CD …
trading industries. Strong object-oriented development skills and software engineering fundamentals. Hands-on experience with cloud data-engineering technologies like Apache Airflow, K8s, ClickHouse, Snowflake, Redis, caching technologies, and Kafka. Proficiency in relational and non-relational databases, SQL, and query optimization. Experience designing infrastructure to meet high-availability SLAs. Experience …
Experience of relevant cloud services within AWS, Azure or GCP. Experience working in an Agile environment. Experience working with common vendor products such as Snowflake or Databricks. Experience working with CI/CD tooling. What you'll get in return is: 25 days' annual leave, rising to 30 days …
a dynamic consultancy environment where creativity, autonomy, and teamwork are valued. Cutting-Edge Tools: Gain hands-on experience with leading tools like dbt, Airflow, Snowflake, and cloud platforms. Key Responsibilities: Design and manage data warehouses using SQL, NoSQL, and cloud platforms. Develop ETL/ELT pipelines using Airflow and dbt. …
team's people and tech budgets in relation to the value created by the team within the business. Our Tech Stack: Cloud Data Warehouse - Snowflake. AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda. Data Governance & Quality - Collate & Monte Carlo. Infrastructure as Code - Terraform. Data Integration & Transformation - Python, dbt, Fivetran …
SQL: at least the basics, but by year 3 you should be quite proficient at pulling data. Data Lake implementation and processing. Knowledge of Snowflake or equivalent. XGBoost, LightGBM, and the ability to use them for tabular data. NLP, with familiarity with modern transformers. Experience using Tableau or Power BI for …
Aladdin). Languages: Java, Python, or C# with Spring Boot or .NET Core. Integrations: Market data feeds (e.g., Refinitiv, Bloomberg). Data Platforms: Warehouses: Snowflake, Google BigQuery, or Amazon Redshift. Analytics: Tableau, Power BI, or Looker for client reporting. Big Data: Apache Spark or Hadoop for large-scale processing. AI …
data pipelines and ETL systems. 5+ years of hands-on experience with big data technology, systems and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common Software Engineering languages such as Python, Scala, Java, SQL, and a proven ability to learn new programming languages. Experience with workflow orchestration …
with AWS data services (S3, Athena, Glue). Experience with Airflow for scheduling and orchestrating workflows. Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery). A pragmatic problem solver who can balance technical excellence with business needs. At Funding Circle we are committed to building diverse teams so …
Azure Cloud infrastructure, data store connections, cloud function concepts. Experience in programming languages like Python, Java, or C#. Experience in cloud data platforms like Snowflake, Databricks or Azure Synapse. Good understanding of Data Modelling techniques. Ideally, you'll also have: a relevant academic background, participation in the open-source community, expertise in …
platforms are built with Clojure, employ a polylith architecture, are deployed using CI/CD, heavily exploit automation, and run on AWS, GCP, k8s, Snowflake, and more. We serve 9 petabytes and 77 billion objects annually, which amounts to 20 billion ad impressions across the globe. You'll play a …
or more programming languages - Python, Scala, Spark or Java. Experience working with petabyte-scale data sets and developing integration-layer solutions in Databricks, Snowflake or similar large platforms. Experience with cloud-based data warehousing and transformation tools like Delta Lake Tables, dbt, Fivetran or Snowflake. Proficiency in machine learning and …
modeling. Experience with relational and NoSQL databases such as Oracle, Sybase, PostgreSQL, SQL Server, MongoDB. Familiarity with big data platforms (e.g., Hadoop, Snowflake). Prior experience with ETL tools or as a SQL developer. Proficiency in Python for data engineering and Tableau for reporting and dashboards. Exposure …
will need to be very proficient in managing large sets of data, including excellent proficiency with ANSI-SQL querying of structured and unstructured data sources (Snowflake, Oracle, SQL, NoSQL). Conduct Code Reviews and Peer Reviews. Ability to assess logs. Ability to solve complex performance issues. Experience with ELK, and …
Banco Santander SA - Milton Keynes, Buckinghamshire, United Kingdom (Hybrid/WFH options)
career. The difference you'll make: Designing, building, and maintaining our AWS Data Estate. This includes S3, Lake Formation and Iceberg, and extends to our Snowflake platform. Designing AWS-based services and capabilities, adopting appropriate modelling techniques and following agreed architectures, design standards, patterns, and methodology. Leading AWS-based service design activities …