we will be happy to support you. KEYWORDS: Python/Google Cloud Platform/GCP/SQL/PostgreSQL/Pandas/SQLAlchemy/Apache Airflow/Databricks/Snowflake/Luigi/BigTable/Redis/CouchDB/RethinkDB/Elasticsearch/Insurance/… Asset Management/Reinsurance/Big Data …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - Enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure …
Python client for Google BigQuery Advanced SQL (GoogleSQL, MySQL) Google Cloud Services Advanced BigQuery Advanced Google Cloud Storage Google Dataform Google Cloud Functions Advanced Apache Airflow Basic Tableau: ability to create basic visualisations Ability to integrate multiple data sources and databases into one system Able to create database …
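The "Advanced SQL" skill named above typically means things like grouped aggregates combined with ratios over the whole table. A minimal sketch, assuming an invented `sales` table; SQLite stands in for BigQuery/MySQL purely so the snippet is self-contained and runnable:

```python
# Hypothetical sketch of a grouped aggregate plus share-of-total, the kind of
# query the "Advanced SQL" requirement refers to. Table and column names are
# invented; SQLite stands in for BigQuery/MySQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 50), ('south', 75);
""")

# Total per region, plus each region's share of the grand total.
rows = conn.execute("""
    SELECT region,
           SUM(amount) AS total,
           SUM(amount) / (SELECT SUM(amount) FROM sales) AS share
    FROM sales
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
print(rows)
```

The same shape of query runs unchanged on GoogleSQL and MySQL; only the connection setup differs.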
methodologies such as CI/CD, Application Resiliency, and Security. Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational databases. · Experience in the financial services industry is a bonus. …
spanning a number of systems. At least 10 years of relevant experience. Hands-on, in-depth experience in the following: Snowflake/DBT/Airflow. Background/working experience in the following: Azure, Power BI/DAX, traditional SQL (SQL Server, MySQL, Postgres), JIRA, Confluence (GitHub/Bitbucket), Azure Data …
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team. Proven experience of working with customer data. You have a strong commitment to accuracy …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of …
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practice and quality in data transformation and modelling. Essential experience with tech including GCP, SQL and DBT. Preferably working experience with: Kafka, Dataform, Airflow, Tableau, Power BI, Redshift, Snowflake, Terraform and BigQuery. This position does not offer visa sponsorship; please refrain from applying if you require sponsorship at any …
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar Orchestration Tools): Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar Reporting Tools): Knowledge of Qlik Sense Cloud or similar reporting tools such as …
team. Qualifications: You will have expertise within the following: Java and Python development knowledge (essential); previous experience with Spark or Hadoop (essential); Trino or Airflow (desirable). Architecture and capabilities: designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration skills. Additional Information: Location …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
TMS (Tealium iQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelines and modelling using SQL, DBT, Airflow, ETL, data warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development …
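The SQL/Python pipeline work named above usually amounts to the extract-transform-load pattern. A minimal sketch under invented names: sqlite3 stands in for Redshift so the example runs anywhere, and the `raw` records play the role of an extracted API payload.

```python
# Hypothetical ETL sketch: extract (raw records), transform (type casting),
# load (insert into a warehouse table), then query. sqlite3 stands in for
# Redshift; table and field names are invented.
import sqlite3

# Extract: records as they might arrive from a web/REST API, all strings.
raw = [
    {"page": "/home", "ms": "120"},
    {"page": "/home", "ms": "80"},
    {"page": "/pricing", "ms": "200"},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_latency (page TEXT, ms INTEGER)")

# Transform + load: cast types before inserting, as a warehouse load would.
conn.executemany(
    "INSERT INTO page_latency VALUES (?, ?)",
    [(r["page"], int(r["ms"])) for r in raw],
)

# Downstream modelling step: aggregate per page.
avg = dict(conn.execute(
    "SELECT page, AVG(ms) FROM page_latency GROUP BY page"
))
print(avg)  # {'/home': 100.0, '/pricing': 200.0}
```

In a production stack the transform would live in a DBT model and the schedule in an Airflow DAG; the shape of the work is the same.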
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
diverse environments, leveraging Azure and other modern technologies. Proven ability to orchestrate complex data workflows and manage Kubernetes clusters on AKS, utilizing tools like Airflow, Kubeflow, Argo, and Dagster. Familiarity with data ingestion tools such as Airbyte and Fivetran, accommodating a wide array of data sources. Mastery of large …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or Relational DB. Demonstrated knowledge of software applications and technical processes within a cloud …
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
LangChain, Semantic Kernel, and tools like MS tooling, Co-Pilot Studio, ML Studio, Prompt flow, Kedro, etc. Proficiency in pipeline orchestration with tools like Airflow, Kubeflow, and Argo. Exceptional communication skills, with the ability to articulate complex statistical concepts clearly. Personal Competencies: A results-oriented professional with a …
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
for advanced content analysis and indexing and developing RAG services. Experience in managing data workflows and Kubernetes clusters on AKS, utilizing tools such as Airflow, Kubeflow, Argo, and Dagster. Familiarity with scripting languages and tools such as Bash, PowerShell, Azure CLI, Terraform, and Helm Charts. Additional Information: Location: This …
on with projects but no requirements; previous experience in this is essential. SKILLS AND EXPERIENCE NEEDED: Experience in Redshift database (AWS). Experience with DBT, Airflow and Fivetran. Working collaboratively with multiple teams across the business. Management/mentoring experience of a team. INTERVIEW PROCESS: 1st Stage: Initial Chat; 2nd …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
the above project of redesigning the Creditsafe platform into the cloud space. You will be expected to work with technologies such as Python, Linux, Airflow, AWS DynamoDB, S3, Glue, Athena, Redshift, Lambda, API Gateway, Terraform, CI/CD. KEY DUTIES AND RESPONSIBILITIES You will actively contribute to the codebase … and participate in peer reviews. Design and build a metadata-driven, event-based distributed data processing platform using technologies such as Python, Airflow, Redshift, DynamoDB, AWS Glue, S3. As an experienced Engineer, you will play a critical role in the design, development, and deployment of our business-critical system. You …
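"Metadata-driven, event-based" processing, as described in the listing above, usually means that an event's metadata selects which handler runs, rather than the routing being hard-coded. A minimal sketch under invented names; in the listing's stack this dispatch would sit behind Airflow or AWS Lambda rather than plain functions:

```python
# Hypothetical sketch of metadata-driven event dispatch: a lookup table keyed
# on event metadata decides which code runs. All names here are invented.

def handle_upload(event):
    return f"stored {event['key']}"

def handle_delete(event):
    return f"removed {event['key']}"

# The routing table is data, so new event types need no changes to dispatch().
HANDLERS = {"upload": handle_upload, "delete": handle_delete}

def dispatch(event):
    handler = HANDLERS.get(event["type"])
    if handler is None:
        raise ValueError(f"no handler for event type {event['type']!r}")
    return handler(event)

results = [dispatch(e) for e in (
    {"type": "upload", "key": "reports/2024.csv"},
    {"type": "delete", "key": "tmp/scratch.csv"},
)]
print(results)
```

Keeping the table as data is what makes the platform "metadata-driven": the same dispatcher serves every event type, and unknown types fail loudly instead of silently.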
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
data practices. Possess strong knowledge of data tools, data management tools, and various data and information technologies, e.g. DAMA DMBOK, Microsoft SQL Server, Couchbase, Apache Druid, Spark, Kafka, Airflow, etc. In-depth understanding of modern data principles, methodologies, and tools. Excellent communication and collaboration skills, with the ability … native computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team …
using Docker, Kubernetes and Cloud services. Experience with the Azure stack will be an asset. Experience designing and implementing event-driven/microservices applications using Apache Kafka, Flink, etc. Exposure to model deployment and serving tools like Seldon Core, KServe, etc. Experience with drift detection and adaptation techniques as well … and familiarity with tools to manage infrastructure as code, like Terraform, and package managers like Helm Charts. Proficiency with pipeline orchestration tools, such as Airflow, Kubeflow, and Argo Workflows. Outstanding communication skills, ability to convey complex technical concepts to non-technical stakeholders and collaborate with cross-functional teams. A …