we will be happy to support you. KEYWORDS: Python/Google Cloud Platform/GCP/SQL/PostgreSQL/Pandas/SQLAlchemy/Apache Airflow/Databricks/Snowflake/Luigi/BigTable/Redis/CouchDB/RethinkDB/Elasticsearch/Insurance/…/Asset Management/Reinsurance/Big Data …
understanding of AWS ecosystem services such as Lambda, Step Functions, and ECS. Experience with Dremio is a nice-to-have. Experience with data stack technologies such as Apache Iceberg and Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation …
designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL, …). Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices: e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, Git/version control, containers, etc. …
Strong understanding of AWS ecosystem services such as Lambda, Step Functions, and ECS. Experience with data stack technologies such as Apache Iceberg and Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
excellence in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark, or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure as …
Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services is a strong plus - CloudFormation, EMR, S3, EC2, Athena, etc. Experience with scheduling services such as Airflow, Oozie. Experience with Data ETL and data modeling. Experience with building large-scale systems with extensive knowledge of data warehousing solutions. Developing prototypes and …
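The snippet above asks for "Data ETL" experience. As a minimal, hypothetical sketch of what an extract-transform-load step involves (using only Python's standard library; the table and column names here are invented for illustration, not taken from any of these roles):

```python
import csv
import io
import sqlite3

# Extract: parse CSV rows; an in-memory string stands in for a source file.
raw = "id,amount\n1,10.4\n2,3.6\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast text fields to proper types and derive a rounded column.
records = [(int(r["id"]), round(float(r["amount"]))) for r in rows]

# Load: insert into SQLite, a stdlib stand-in for a warehouse target
# such as Redshift or Snowflake mentioned in the listings.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE amounts (id INTEGER PRIMARY KEY, amount_rounded INTEGER)")
conn.executemany("INSERT INTO amounts VALUES (?, ?)", records)
total = conn.execute("SELECT SUM(amount_rounded) FROM amounts").fetchone()[0]
print(total)  # 14
```

In production, orchestrators like Airflow or Dagster schedule and retry steps of exactly this shape, with each stage as a separate task.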
pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache Airflow). Good understanding of data modelling, algorithms, and data transformation techniques to work with data platforms. Working knowledge of cloud development practices (AWS/ …
the cloud (AWS preferred). Solid understanding of libraries like Pandas and NumPy. Experience with data warehousing tools like Snowflake, Databricks, BigQuery. Familiarity with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools. Commercial experience with performant database programming in SQL. Ability to solve complex technical issues, anticipating risks before they arise. Comfortable …
enhancements to existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java, or C#. Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive. Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation and driving …
Python client for Google BigQuery. Advanced SQL (GoogleSQL, MySQL). Google Cloud services: advanced BigQuery, advanced Google Cloud Storage, Google Dataform, Google Cloud Functions. Advanced Apache Airflow. Basic Tableau: ability to create basic visualisations. Ability to integrate multiple data sources and databases into one system. Able to create database …
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠 Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt. 🌳 Environment: Agile. ✍️ Process: 3 stages. No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call …
South East London, England, United Kingdom Hybrid / WFH Options
Burns Sheehan
data lake or warehouse. Data Pipeline Development: design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow, and DBT. Must-haves: a team player, happy to work with several teams; this is key, as you will be reporting directly to the …
South East London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
FastAPI for building APIs and SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker and Kubernetes for containerisation and orchestration. Understanding of Airflow and DAGs. Experience in building applications using Kafka. Solid OOP principles and design patterns. Permanent/full-time employment. Hybrid working environment (2/3 days in the office). If …
methodologies such as CI/CD, Application Resiliency, and Security. Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational DBs · Experience in the financial services industry is a bonus. …
project delivery. A commitment to mentoring and developing junior team members, coupled with a dedication to high standards in your work. Tech stack: dbt, SQL, Snowflake, Airflow, AWS, Tableau. DE&I is also at the heart of the business, and they strongly believe that ensuring diversity of background and experience will lead …
spanning a number of systems. At least 10 years of relevant experience. Hands-on, in-depth experience with the following: Snowflake/DBT/Airflow. Background/working experience with the following: Azure, Power BI/DAX, traditional SQL (SQL Server, MySQL, Postgres), JIRA, Confluence (GitHub/Bitbucket), Azure Data …
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team. Proven experience of working with customer data. You've a strong commitment to accuracy …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of …
South East London, England, United Kingdom Hybrid / WFH Options
Job N Job
of successfully converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional teams. Excellent communication …
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practice and quality in data transformation and modelling. Essential experience with tech including GCP, SQL, and DBT. Preferably working experience with: Kafka, Dataform, Airflow, Tableau, Power BI, Redshift, Snowflake, Terraform, and BigQuery. This position does not offer visa sponsorship; please refrain from applying if you require sponsorship at any …
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive, and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data-heavy software …
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar orchestration tools): experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar reporting tools): knowledge of Qlik Sense Cloud or similar reporting tools such …