we will be happy to support you. KEYWORDS: Python/Google Cloud Platform/GCP/SQL/PostgreSQL/Pandas/SQLAlchemy/Apache Airflow/Databricks/Snowflake/Luigi/BigTable/Redis/CouchDB/RethinkDB/Elasticsearch/Insurance/… Asset Management/Reinsurance/Big Data …
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI; Azure ML is a plus. Experience with …
WHAT YOU'LL DO Design, document & implement the data pipelines that feed data models for subsequent consumption in Snowflake, using dbt and Airflow. Ensure correctness and completeness of the data transformed via engineering pipelines for end consumption in analytical dashboards. Actively monitor and triage technical challenges in critical … Extremely talented in applying SCD, CDC and DQ/DV frameworks. Familiar with JIRA & Confluence. Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake. Desire to continually keep up with advancements in data engineering practices. Knowledge of AWS cloud and Python is a … Requirements: 5+ years of IT experience with a major focus on data warehouse/database related projects. Must have exposure to technologies such as dbt, Apache Airflow, Snowflake. Experience in data platforms: Snowflake, Oracle, SQL Server, MDM etc. Expertise in writing SQL and database objects: stored procedures, functions, and …
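The core job an orchestrator like Airflow (or dbt's DAG) performs for pipelines like these is running transformations in dependency order. As a minimal, library-free sketch of that idea, the task names and layering below (raw → staging → mart, the typical dbt convention) are invented for illustration, not taken from any real project:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical pipeline dependencies: each task maps to the set of
# upstream tasks that must finish before it can run.
dependencies = {
    "stg_policies": {"raw_policies"},
    "stg_claims": {"raw_claims"},
    "mart_loss_ratio": {"stg_policies", "stg_claims"},
}

def run_order(deps):
    """Return tasks in an order that respects upstream dependencies --
    the scheduling guarantee an orchestrator such as Airflow provides."""
    return list(TopologicalSorter(deps).static_order())

print(run_order(dependencies))
```

In a real deployment, each task name would correspond to an Airflow task or a dbt model, and the orchestrator would also handle retries, scheduling, and parallelism of independent branches.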
or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech …
of the company's data infrastructure. You will work with some of the most innovative tools in the market, including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and DBT. The role is hybrid, with 2 days in the office in central London, and the company is … Experience developing and maintaining data pipelines from scratch. Data modelling, data integration and transformation experience. Hands-on work with tools such as Snowflake, AWS, Airflow, and DBT. Proficiency in data manipulation, scripting and automation with Python. Desirable: experience leading teams; version control systems such as Git or Bitbucket; Agile …
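The "data manipulation, scripting and automation with Python" asked for above often amounts to reconciliation checks between a source extract and its transformed output. A minimal pandas sketch, with table and column names invented for illustration:

```python
import pandas as pd

# Hypothetical source extract and a derived, transformed frame.
source = pd.DataFrame({"policy_id": [1, 2, 3, 4],
                       "premium": [100, 200, 150, 300]})
transformed = source.assign(premium_gbp=source["premium"] * 0.79)

def reconcile(src: pd.DataFrame, out: pd.DataFrame, key: str) -> bool:
    """A pipeline-completeness check: every source key must survive the
    transformation, and no duplicate keys may be introduced."""
    return bool(out[key].is_unique) and set(src[key]) == set(out[key])

print(reconcile(source, transformed, "policy_id"))  # True when nothing was dropped
```

In production this kind of check would typically run as a test step after each pipeline stage (dbt tests and Airflow check tasks serve the same purpose).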
designing and building robust, scalable, distributed data systems and pipelines, using open-source and public cloud technologies. Strong experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL …). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments, e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices, e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers etc. …
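The streaming technologies named above (Kafka Streams, Flink) are built around windowed aggregations over an event stream. A dependency-free sketch of one such computation, a tumbling (fixed, non-overlapping) window count, with the event data invented for illustration:

```python
from collections import defaultdict

# Hypothetical event stream: each event carries a timestamp (seconds).
events = [
    {"ts": 3, "user": "a"},
    {"ts": 7, "user": "b"},
    {"ts": 12, "user": "a"},
    {"ts": 14, "user": "c"},
]

def tumbling_counts(stream, window_size):
    """Count events per fixed, non-overlapping time window --
    the simplest form of the windowed aggregation that Kafka Streams
    and Flink perform over unbounded streams."""
    counts = defaultdict(int)
    for event in stream:
        window_start = (event["ts"] // window_size) * window_size
        counts[window_start] += 1
    return dict(counts)

print(tumbling_counts(events, 10))  # {0: 2, 10: 2}
```

Real stream processors add what this sketch omits: out-of-order events, watermarks, and state that survives restarts.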
Westminster, Colorado, United States Hybrid / WFH Options
Maxar Technologies
Prior experience with CI/CD technologies such as Jenkins. Prior experience with any of the following: Trino/Starburst, dbt (core or cloud), Apache Superset, OpenMetadata, Apache Airflow, Tableau. Prior experience with RDS databases or Postgres. Agile software development lifecycle experience. These skills would be amazing …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration: enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling: Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure …
Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services a strong plus: CloudFormation, EMR, S3, EC2, Athena etc. Experience with scheduling services such as Airflow, Oozie. Experience with data ETL and data modelling. Experience with building large-scale systems with extensive knowledge of data warehousing solutions. Developing prototypes and …
Terraform, and infrastructure-as-code (IaC) best practices. Familiarity with the Python programming language as applied to Spark and machine learning. Familiarity with Databricks and Apache Airflow products. Required Education & Experience: Bachelor's degree in Computer Science, Information Systems, Software, Electrical or Electronics Engineering, or a comparable field of study …
impact on the passenger experience. It would also be advantageous to have experience of software services at scale; any knowledge of AWS infrastructure, Airflow, Kafka, and data streaming with Spark/Scala; comprehension of networking fundamentals; and a background in cloud computing, enterprise computing, server and virtualization technologies …
pandas, numpy, pyspark. Good understanding of OOP, software design patterns, and SOLID principles. Good experience with Docker. Good experience with Linux. Good experience with Airflow. Good knowledge of cloud architecture. Good experience with Terraform. Expert experience with database systems (Snowflake, SQL, Postgres etc.). Experience of micro-service development and …
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday …
required: Python, SQL, Kubernetes, CI/CD experience with relevant tooling such as Jenkins, Docker or Terraform, cloud services experience with AWS/Azure. Ideally: Airflow, Java, experience working with front-office trading systems and financial market data. For more information on this role or any other contract/permanent …
Hybrid (2 days a week) JD: Experience of working with a streaming & batch technology stack: Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow, and other similar technologies. SME-level skills and experience of designing/architecting test automation solutions; the ability to creatively problem-solve is critical for …
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java or C#. Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive. Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation …