WHAT YOU'LL DO Design, document and implement the data pipelines that feed data models for subsequent consumption in Snowflake, using dbt and Airflow. Ensure correctness and completeness of the data transformed by engineering pipelines for end consumption in analytical dashboards. Actively monitor and triage technical challenges in critical … Highly skilled in applying SCD, CDC and DQ/DV frameworks. Familiar with JIRA and Confluence. Must have exposure to technologies such as dbt, Apache Airflow and Snowflake. Desire to continually keep up with advancements in data engineering practices. Knowledge of AWS cloud and Python is a … Requirements 5+ years of IT experience with a major focus on data warehouse/database projects Must have exposure to technologies such as dbt, Apache Airflow and Snowflake. Experience in data platforms: Snowflake, Oracle, SQL Server, MDM etc. Expertise in writing SQL and database objects - stored procedures, functions, and more »
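The listing above asks for SCD and CDC skills. As a point of reference, here is a minimal, library-free Python sketch of the SCD Type 2 pattern it alludes to: when a tracked attribute changes, the current dimension row is expired and a new versioned row is opened. The column names (`customer_id`, `city`, `valid_from`, `valid_to`, `is_current`) are invented for illustration, not taken from any specific schema; in practice a tool like dbt's snapshot feature would generate this logic.

```python
from datetime import date

def scd2_apply(dimension, incoming, today):
    """Merge incoming source rows into an SCD Type 2 dimension table.

    Hypothetical sketch: tracks history of the 'city' attribute by
    expiring the current row and appending a new version on change.
    """
    current = {r["customer_id"]: r for r in dimension if r["is_current"]}
    for row in incoming:
        existing = current.get(row["customer_id"])
        if existing is None:
            # brand-new key: open a fresh current row
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif existing["city"] != row["city"]:
            # attribute changed: expire the old row, append the new version
            existing["valid_to"] = today
            existing["is_current"] = False
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        # unchanged rows are left untouched
    return dimension

dim = [{"customer_id": 1, "city": "Leeds",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, [{"customer_id": 1, "city": "London"}], date(2024, 6, 1))
```

After the merge, `dim` holds both the expired Leeds row and a new current London row, which is what lets downstream dashboards query the dimension "as of" any date.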
quickly and apply new skills. Desirable: Solid understanding of microservices development. Working knowledge of SQL and NoSQL databases. Familiar with, or able to quickly learn, Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, serverless computing, GraphQL, APIs, APIM. Good skills working with JSON, XML and YAML files. Experience working more »
Contract Data Engineer: Fintech | AWS | Airflow | Glu This dynamic FinTech is looking to add to its already strong DataOps team. As a Data Engineer, you will work closely with DevOps and analytics teams, building out and supporting the Data Warehouse (Redshift, migrating to Snowflake), core data pipelines, and … and dealing with ad-hoc requests About You We're looking for an experienced Data Engineer with excellent knowledge of Snowflake, AWS, Python and Apache Airflow who is ready to lead by example and is used to rolling up their sleeves to get things done. The successful candidate … 3NF and dimensional modelling, Kimball, DV 2.0 etc.) Strong experience in building robust and scalable ELT/ETL data pipelines Proficient coding in Python and Apache Spark, expert knowledge of SQL and good experience with shell-scripting languages Working knowledge of orchestration tools, e.g. Apache Airflow Experience more »
of the company's data infrastructure. You will work with some of the most innovative tools in the market, including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and DBT! The role is hybrid, with 2 days in the office in central London, and the company is … Experience developing and maintaining data pipelines from scratch Data modelling, data integration and transformation experience Hands-on work with tools such as Snowflake, AWS, Airflow and DBT Proficiency in data manipulation, scripting and automation with Python Desirable: Experience leading teams Version control systems such as Git or Bitbucket Agile more »
designing and building robust, scalable, distributed data systems and pipelines, using open-source and public cloud technologies. Strong experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL … . Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments, e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices, e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, Git/version control, containers etc. more »
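Orchestration tools like the Airflow and Dagster named above are, at their core, schedulers that run tasks in dependency order. A toy sketch of that idea using only the standard library, with made-up task names, might look like this; real orchestrators layer scheduling, retries, and persisted state on top:

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run callables in dependency order.

    tasks: name -> zero-argument callable
    deps:  name -> set of upstream task names that must run first
    (Illustrative stand-in for an orchestrator's DAG executor.)
    """
    results = {}
    for name in TopologicalSorter(deps).static_order():
        # each task runs only after all of its upstream tasks
        results[name] = tasks[name]()
    return results

log = []
tasks = {
    "extract":   lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
    "load":      lambda: log.append("load"),
}
deps = {"transform": {"extract"}, "load": {"transform"}}
run_pipeline(tasks, deps)
```

`graphlib.TopologicalSorter` (Python 3.9+) does the ordering; here `extract` always precedes `transform`, which precedes `load`, mirroring how an Airflow DAG's edges constrain task execution.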
Software Engineer Programmer Full Stack Python React Fixed Income Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Saga Cypress Node gRPC Front Office Trading Investment Management Asset Manager) required by our asset management client in London. … expanded to cover elements of risk and provide further analysis. The product is built in React, TypeScript, Redux, Redux Saga, Ag-Grid, Node, Python, Airflow, Ignite, gRPC, PostgreSQL, protobuf and AWS. This is a full stack role with an emphasis on the Python Back End. Candidates also need some more »
City of London, London, United Kingdom Hybrid / WFH Options
GCS Ltd
harnessing diverse AWS services. Key Requirements: High level of experience in both SQL and Python programming (10+ years) Experience managing data engineering pipelines using Apache Airflow Proficiency in CI/CD pipelines and automation Git proficiency for version control (branching strategies and repo management) Competent in monitoring tools more »
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - Enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure more »
it is key that they have experience of working with a streaming and batch technology stack – Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow and other similar technologies SME-level skills and experience of designing/architecting test automation solutions; the ability to creatively problem-solve is critical for more »
Degree in Computer Science, Engineering, Management Information Systems, Mathematics, a related field, or equivalent work experience (3+ years) Experience in: Data orchestration technologies, specifically Airflow and/or DBT Experience with streaming data architectures, specifically Kafka Knowledge of semi-structured data: Parquet, Avro, JSON A deep understanding of AWS Cloud more »
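"Semi-structured data" in listings like the one above usually means nested records (JSON, Avro, Parquet) that need flattening into columns before warehouse loading. A small standard-library sketch of that step, with an invented record shape:

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested dicts into dot-separated column names.

    Illustrative only: real pipelines also handle lists, nulls,
    and schema drift before writing to a columnar format.
    """
    out = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            out.update(flatten(value, path))  # recurse into nested objects
        else:
            out[path] = value
    return out

record = json.loads('{"id": 7, "user": {"name": "Ada", "geo": {"country": "UK"}}}')
flat = flatten(record)
```

Here `flat` maps `"id"`, `"user.name"` and `"user.geo.country"` to scalar values, the shape a columnar sink such as Parquet expects.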
ends (React, Redux, NodeJS, Webpack) • Strong understanding of AWS ecosystem services such as Lambda, Step Functions and ECS. • Experience with data stack technologies, such as Apache Iceberg & DBT. Preferred Skills • Experience with an RDBMS like PostgreSQL would be a plus. Exposure to Apache Airflow, Prefect or Dagster would be beneficial. more »
field (STEM) Technical proficiency in cloud-based data solutions (AWS, Azure or GCP), engineering languages including Python, SQL and Java, and pipeline management tools, e.g. Apache Airflow. Familiarity with big data technologies such as Hadoop or Spark. If this opportunity is of interest, or you know anyone who would be interested in more »
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google more »
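The csv-module experience mentioned above typically means row-level transformations like the following self-contained sketch; the field names are invented, and `io.StringIO` stands in for real files so the example needs no disk access:

```python
import csv
import io

# Read rows, derive a column, write the result -- the basic shape of a
# csv-module transformation step in a Python pipeline.
raw = io.StringIO("product,price,qty\nwidget,2.50,4\ngadget,10.00,2\n")
out = io.StringIO()

reader = csv.DictReader(raw)
writer = csv.DictWriter(out, fieldnames=["product", "total"])
writer.writeheader()
for row in reader:
    # derive a total column from price * quantity
    writer.writerow({"product": row["product"],
                     "total": f"{float(row['price']) * int(row['qty']):.2f}"})

result = out.getvalue()
```

`DictReader`/`DictWriter` keep the code keyed by column name rather than position, which makes such transforms robust to column reordering in the input.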
out on the front end! YOUR EXPERIENCE Python Cloud experience - AWS/GCP/Azure CI/CD Data modeling experience will be useful Airflow & DBT experience will be useful THE BENEFITS An education budget is available to learn and develop with the company Matched pension Travel budget in more »
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation more »
plus. • A solid understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. • Experience with AWS services. Familiarity with S3, ECS, and EC2/Fargate would be particularly beneficial. • Proven ability to collaborate effectively with more »
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
creating ETL pipelines in Python * Exposure to analytical data warehouses such as Snowflake, Redshift or BigQuery (Redshift preferred) * Experience with data orchestrators such as Airflow, AWS Step Functions, AWS Batch * Knowledge of Agile development methodologies * Knowledge of automated delivery processes * Experience designing and building autonomous data pipelines BENEFITS Competitive more »