problem-solving and communication skills. Proficiency in scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Solid understanding of software development best practices, including version control (Git), testing, and code review
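Several of these listings ask for an understanding of the DAG model that orchestrators such as Apache Airflow are built on: tasks only run once all of their upstream dependencies have completed. A minimal, dependency-free sketch of resolving such a graph into a valid execution order (the pipeline and task names are hypothetical, for illustration only):

```python
from collections import deque

def execution_order(deps):
    """Return a topological ordering of tasks (Kahn's algorithm).

    deps maps each task to the set of upstream tasks that must run first.
    """
    indegree = {task: len(upstream) for task, upstream in deps.items()}
    downstream = {task: [] for task in deps}
    for task, upstream in deps.items():
        for dep in upstream:
            downstream[dep].append(task)

    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in downstream[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)

    if len(order) != len(deps):
        raise ValueError("cycle detected: not a DAG")
    return order

# Hypothetical ETL pipeline: extract feeds transform, which feeds the rest.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}
print(execution_order(pipeline))  # ['extract', 'transform', 'validate', 'load']
```

The cycle check at the end is what makes the graph *acyclic* in practice: an orchestrator must refuse a pipeline whose dependencies loop.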
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, Azure ML is a plus Experience with
of the company's data infrastructure. You will work with some of the most innovative tools in the market including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and dbt! The role is hybrid, with 2 days in the office in central London and the company is … Experience developing and maintaining data pipelines from scratch Data modelling, data integration and transformation experience Hands-on work with tools such as Snowflake, AWS, Airflow, and dbt Proficiency in data manipulation, scripting and automation with Python Desirable: Experience leading teams Version control systems such as Git or Bitbucket Agile
and transformation. 4. Develop and maintain ETL workflows, scripts, and data processing jobs using programming languages (e.g., Python, Java, Scala) and ETL tools (e.g., Apache Spark, Apache Airflow). 5. Identify and address data quality issues and implement data cleansing, validation, and enrichment processes. 6. Collaborate with
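The duties in points 4 and 5 — ETL jobs plus cleansing and validation — can be sketched in plain Python. The field names and validation rules below are invented for illustration; a real job would read from and write to actual stores:

```python
def extract(raw_rows):
    """Extract: parse raw CSV-style strings into records."""
    return [dict(zip(("id", "amount"), line.split(","))) for line in raw_rows]

def transform(records):
    """Transform: cast types; cleanse malformed rows; validate values."""
    clean = []
    for rec in records:
        try:
            row = {"id": int(rec["id"]), "amount": float(rec["amount"])}
        except (ValueError, KeyError):
            continue            # cleansing: discard rows that fail to parse
        if row["amount"] >= 0:  # validation: reject negative amounts
            clean.append(row)
    return clean

def load(records, target):
    """Load: append validated records to the target store."""
    target.extend(records)
    return len(records)

store = []
raw = ["1,10.5", "2,-3.0", "bad,row", "3,7"]
load(transform(extract(raw)), store)
print(store)  # [{'id': 1, 'amount': 10.5}, {'id': 3, 'amount': 7.0}]
```

In production the same extract/transform/load stages would typically each become one task in an Airflow DAG or a Spark job, but the separation of concerns is the same.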
designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL … . Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices: e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers, etc.
Degree in Computer Science, Engineering, Management Information Systems, Mathematics, a related field, or equivalent work experience (3+ years) Experience in: Database orchestration technologies, specifically Airflow and/or DBT Experience with streaming data architectures, specifically Kafka Knowledge of semi-structured data: Parquet, Avro, JSON. A deep understanding of AWS Cloud
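The semi-structured formats named here (Parquet, Avro, JSON) all allow nested records, and a routine engineering task is flattening them into warehouse-style columns. A small stdlib-only sketch — the event schema is hypothetical:

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested dicts into dotted column names, warehouse-style."""
    flat = {}
    for key, value in obj.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=col + "."))
        else:
            flat[col] = value
    return flat

event = json.loads('{"user": {"id": 7, "geo": {"country": "GB"}}, "action": "click"}')
print(flatten(event))
# {'user.id': 7, 'user.geo.country': 'GB', 'action': 'click'}
```

Parquet and Avro carry an explicit schema alongside the data, so in practice this flattening is usually driven by the schema rather than by inspecting each record, but the column-naming convention is the same.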
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - Enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure
Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services a strong plus - CloudFormation, EMR, S3, EC2, Athena etc. Experience with scheduling services such as Airflow, Oozie. Experience with Data ETL and data modeling Experience with building large-scale systems with extensive knowledge in data warehousing solutions. Developing prototypes and
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java, or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation
field (STEM) Technical proficiency in cloud-based data solutions (AWS, Azure or GCP), engineering languages including Python, SQL, Java, and pipeline management tools e.g., Apache Airflow. Familiarity with big data technologies, Hadoop, or Spark. If this opportunity is of interest, or you know anyone who would be interested in
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call
the business and our customers. About you: Proficiency with languages/tools for data processing and analytics, such as SQL, Python/Scala, Spark, Airflow, etc. Strong understanding of data architectures, data modelling, and designing scalable and fault-tolerant data pipelines and data lakes/warehouses. You’ve worked
methodologies such as CI/CD, Application Resiliency, and Security Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or Relational DB. · Experience in the Financial Services industry is a bonus.
A commitment to mentoring and developing junior team members, coupled with a dedication to high standards in your work. Tech Stack : dbt, SQL, Snowflake, Airflow, AWS, Tableau. DE&I is also at the heart of the business and they strongly believe that ensuring diversity of background and experience will
spanning a number of systems At least 10 years of relevant experience Hands-on in-depth experience in the following: Snowflake/DBT/Airflow Background/working experience in the following: Azure Power BI/DAX Traditional SQL (SQL Server, MySQL, Postgres) JIRA, Confluence, (GitHub/Bitbucket) Azure Data
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar Orchestration Tools): Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar Reporting Tools): Knowledge of Qlik Sense Cloud or similar reporting tools such
experience working in cross-functional agile teams. Technologies such as Docker and orchestration tools like Kubernetes for containerized deployments. Workflow management tools such as Airflow for orchestrating complex data pipelines and ETL processes. Certifications in Azure cloud services and data engineering technologies, demonstrating expertise and proficiency in the Azure
our customers. Skills and experience we’re looking for: Experience of creating advanced visualisations in Tableau. Experience in SQL and Python. Experience of AWS, Airflow, S3 and working with Snowflake in a large complex organisation is advantageous. Experience of establishing processes to identify and manage issues, in data or
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
diverse environments, leveraging Azure and other modern technologies. Proven ability to orchestrate complex data workflows and manage Kubernetes clusters on AKS, utilizing tools like Airflow, Kubeflow, Argo, and Dagster. Familiarity with data ingestion tools such as Airbyte and Fivetran, accommodating a wide array of data sources. Mastery of large
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies, AWS, GCP, Azure Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset
data manipulation and analysis Experience with Kubernetes and Docker for scalable infrastructure deployment Hands-on experience with CI/CD and orchestration tools e.g. Airflow, Jenkins, Bamboo, etc. Experience with MongoDB is highly beneficial Knowledge of data pipeline development and optimization Familiarity with data science principles and workflows is
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
Langchain, Semantic Kernel, and tools like MS tooling, Co-Pilot Studio, ML Studio, Prompt flow, Kedro, etc. Proficiency in pipeline orchestration with tools like Airflow, Kubeflow, and Argo. Exceptional communication skills, with the ability to articulate complex statistical concepts clearly. Personal Competencies: A results-oriented professional with a
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
for advanced content analysis and indexing & developing RAG services. Experience in managing data workflows and Kubernetes clusters on AKS, utilizing tools such as Airflow, Kubeflow, Argo, and Dagster. Familiarity using scripting languages and tools such as Bash, PowerShell, Azure CLI, Terraform, and Helm Charts. Additional Information Location: This