WHAT YOU'LL DO Design, document & implement the data pipelines that feed data models for subsequent consumption in Snowflake, using dbt and Airflow. Ensure correctness and completeness of the data being transformed via engineering pipelines for end consumption in analytical dashboards. Actively monitor and triage technical challenges in critical … Extremely talented in applying SCD, CDC and DQ/DV frameworks. Familiar with JIRA & Confluence. Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake. Desire to continually keep up with advancements in data engineering practices. Knowledge of AWS cloud and Python is a … Requirements 5+ years of IT experience with a major focus on data warehouse/database-related projects. Must have exposure to technologies such as dbt, Apache Airflow, Snowflake. Experience in data platforms: Snowflake, Oracle, SQL Server, MDM etc. Expertise in writing SQL and database objects: stored procedures, functions, and …
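The SCD work this listing mentions usually means Type 2 slowly changing dimensions: rather than overwriting a record, the current version is closed out and a new dated version appended. A minimal Python sketch of that idea, assuming a hypothetical in-memory dimension table (the field names and helper are illustrative, not any employer's implementation):

```python
from datetime import date

def scd2_upsert(dim_rows, key, incoming, today=None):
    """Type 2 SCD update: close the current row for `key`
    and append the new version with fresh effective dates."""
    today = today or date.today()
    for row in dim_rows:
        if row["key"] == key and row["end_date"] is None:
            if {k: row[k] for k in incoming} == incoming:
                return dim_rows  # attributes unchanged: keep history as-is
            row["end_date"] = today  # close the old version
            break
    dim_rows.append({"key": key, **incoming,
                     "start_date": today, "end_date": None})
    return dim_rows

# Example: a customer moves city; the old row is closed, a new one opened.
dim = [{"key": 1, "city": "Leeds",
        "start_date": date(2023, 1, 1), "end_date": None}]
dim = scd2_upsert(dim, 1, {"city": "London"}, today=date(2024, 6, 1))
```

In a dbt/Snowflake stack the same pattern is typically expressed declaratively (e.g. with dbt snapshots) rather than in application code; the sketch only shows the underlying bookkeeping.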
quickly and apply new skills. Desirable: Solid understanding of microservices development. Working knowledge of SQL and NoSQL databases. Familiar with, or able to quickly learn, Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, serverless computing, GraphQL, APIs, APIM. Good skills working with JSON, XML and YAML files. Experience working …
Contract Data Engineer: Fintech | AWS | Airflow | Glu This dynamic FinTech is looking to add to its already strong DataOps team. As a Data Engineer, you will work closely with DevOps and analytics teams, building out and supporting the Data Warehouse (Redshift, migrating to Snowflake), core data pipelines, and … and dealing with ad-hoc requests. About You We’re looking for an experienced Data Engineer with excellent knowledge of Snowflake, AWS, Python, and Apache Airflow who is ready to lead by example and is used to rolling up their sleeves to get things done. The successful candidate … 3NF and dimensional modelling, Kimball, DV 2.0 etc.) Strong experience in building robust and scalable ELT/ETL data pipelines. Proficient coding in Python and Apache Spark, expert knowledge of SQL, and good experience with shell-scripting languages. Working knowledge of orchestration tools, e.g. Apache Airflow. Experience …
capable of making decisions, and skilled in troubleshooting issues. Technical Skills: Primary: Python, Snowflake Dev, Postgres Dev, APIM (API Management), API, dbt, Azure DevOps. Secondary: Apache Kafka, Azure Event Hubs, Apache Airflow, Apache Flink, Grafana, Prometheus, Terraform, Kubernetes, Power BI. Streaming Lead, Data Engineering & Quality Pod …
a database. Working knowledge of Snowflake or Redshift data warehouses. Experience with ETL/ELT processes and tools (preferably with Matillion and/or Apache Airflow). Hands-on experience with AWS (notably ECS and S3). Financial services experience. Experience designing and implementing data warehouses. Experience developing and maintaining … Apache Airflow DAGs to implement data pipelines. Hands-on experience using dbt to manage and implement data transformations. A working understanding of Docker. Experience working with big data and/or MPP (massively parallel processing) databases. Experience supporting applications running on Amazon AWS. Experience in continuous delivery and …
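The Airflow DAGs these listings keep asking for are just directed acyclic graphs of tasks executed in dependency order. A library-free sketch of that scheduling idea (Kahn's topological sort), with a made-up ELT pipeline as the example; task names are hypothetical, and real Airflow handles this internally:

```python
from collections import deque

def run_order(deps):
    """Return a valid execution order for a DAG given {task: [upstream, ...]}."""
    indegree = {t: len(up) for t, up in deps.items()}
    downstream = {t: [] for t in deps}
    for task, ups in deps.items():
        for u in ups:
            downstream[u].append(task)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for d in downstream[task]:  # a finished task unblocks its dependents
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    if len(order) != len(deps):
        raise ValueError("cycle detected: not a DAG")
    return order

# Hypothetical ELT pipeline: extract -> load -> dbt transform -> publish
deps = {"extract": [], "load": ["extract"],
        "dbt_run": ["load"], "publish": ["dbt_run"]}
print(run_order(deps))  # ['extract', 'load', 'dbt_run', 'publish']
```

In Airflow itself the same graph would be declared with operators and `>>` dependencies inside a `DAG` definition; the sketch only shows why the acyclicity requirement matters.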
of the company's data infrastructure. You will work with some of the most innovative tools in the market, including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and dbt! The role is hybrid, with 2 days in the office in central London, and the company is … Experience developing and maintaining data pipelines from scratch. Data modelling, data integration and transformation experience. Hands-on work with tools such as Snowflake, AWS, Airflow, and dbt. Proficiency in data manipulation, scripting and automation with Python. Desirable: Experience leading teams. Version control systems such as Git or Bitbucket. Agile …
designing and building robust, scalable, distributed data systems and pipelines, using open-source and public cloud technologies. Strong experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL …). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments, e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices, e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, Git/version control, containers etc. …
Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. You MUST have the … AWS) or Google Cloud Platform (GCP). Experience in a trading environment with a bank, broker, asset manager or hedge fund. PostgreSQL, Ag-Grid, Redux, Airflow, Dasher. Role: Python Developer (Software Engineer Programmer Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. The application allows the portfolio managers to generate 'what if' scenarios across their portfolios so that they can simulate market conditions. …
Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. You MUST have the … AWS) or Google Cloud Platform (GCP). Experience in a trading environment with a bank, broker, asset manager or hedge fund. PostgreSQL, Ag-Grid, Redux, Airflow, Dasher. Role: Full-Stack Developer (Software Engineer Programmer Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. The application allows the portfolio managers to generate 'what if' scenarios across their portfolios so that they can simulate market …
data models and build data transformation pipelines. You have experience with data orchestration and ETL pipelines using tools such as Azure Data Factory or Airflow. You have a strong understanding of data modelling, data warehousing concepts, and query optimisation for large, complex datasets. You have strong SQL and Python …
City of London, London, United Kingdom Hybrid / WFH Options
GCS Ltd
harnessing diverse AWS services. Key Requirements: High level of experience in both SQL and Python programming (10+ years). Experience managing data engineering pipelines using Apache Airflow. Proficiency in CI/CD pipelines and automation. Git proficiency for version control (branching strategies and repo management). Competent in monitoring tools …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration: enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, dbt, SnapLogic, Spark or similar tools. Software Tooling: Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure …
it is key they have experience of working with a streaming & batch technology stack – Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow, and other similar technologies. SME-level skills and experience of designing/architecting test automation solutions; the ability to creatively problem-solve is critical for …