…e.g. GDPR compliance)
- Proficiency in ELT/ETL processes
- Strong experience in data ingestion, transformation, and orchestration technology: ETL tools such as Informatica, DataStage, SSIS, etc., or open-source tools such as Meltano, Airbyte, and Airflow
- Proven experience with dbt (data build tool)
- Proficiency with business intelligence tools (Power BI, Tableau, SAP BI, or similar)
Integration & Programming
- Hands-on experience with API development …
…managing a team of Data Engineers
- Experience with data modelling, data warehousing, and building ETL pipelines
- Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, plus Snowflake, Fivetran, Airbyte, dbt, Docker, and Argo
- Experience in SQL, Python, and Terraform
- Experience building data pipelines and applications to stream and process datasets (see the sketch after this listing)
- Robust understanding of DevOps principles is required
- Experience managing …
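As a loose illustration of the streaming item in the listing above, here is a minimal plain-Python sketch; the file name and record fields are hypothetical, and a real pipeline would swap the CSV source for a queue or API.

```python
# Tiny sketch of streaming dataset processing in plain Python: records are
# read, cleaned, and aggregated lazily, so the full dataset never has to
# sit in memory. The file name and record fields are hypothetical.
import csv
from typing import Iterator


def read_events(path: str) -> Iterator[dict]:
    # Yield one row at a time instead of loading the whole file.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)


def clean(events: Iterator[dict]) -> Iterator[dict]:
    for event in events:
        if event.get("user_id"):  # drop malformed rows
            yield {**event, "amount": float(event["amount"])}


if __name__ == "__main__":
    total = sum(e["amount"] for e in clean(read_events("events.csv")))
    print(f"total amount: {total:.2f}")
```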
…inform strategic decisions both at the Board/Executive level and at the business unit level.
Key Responsibilities
- Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset (see the sketch after this listing)
- Work closely with stakeholders across the company to gather data requirements and set up dashboards
- Promote a data-driven culture at Notabene and train, upskill power …
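For a sense of what one transformation step in a stack like this could look like, here is a minimal sketch using DuckDB's Python API; the table, column, and file names are hypothetical, and in the stack above Airbyte would typically own ingestion while dbt owns the models.

```python
# Minimal sketch of an in-process transformation step with DuckDB.
# Table, column, and file names are hypothetical.
import duckdb

con = duckdb.connect("analytics.duckdb")  # file-backed local database

# Land a raw CSV extract into a staging table.
con.execute("""
    CREATE OR REPLACE TABLE stg_orders AS
    SELECT * FROM read_csv_auto('raw_orders.csv')
""")

# A small dbt-style model: daily revenue per customer.
daily = con.execute("""
    SELECT customer_id,
           CAST(order_ts AS DATE) AS order_date,
           SUM(amount)            AS revenue
    FROM stg_orders
    GROUP BY 1, 2
    ORDER BY 2
""").fetchdf()

print(daily.head())
```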
…products are recognised in industry analyst reports such as Gartner's Magic Quadrant, the Forrester Wave, and the Frost Radar.
Our tech stack:
- Superset and similar data visualisation tools
- ETL tools: Airflow, dbt, Airbyte, Flink, etc.
- Data warehousing and storage solutions: ClickHouse, Trino, S3
- AWS Cloud, Kubernetes, Helm
- Relevant programming languages for data engineering tasks: SQL, Python, Java, etc.
What you will be doing …
…Technical Leadership & Design
- DevSecOps tooling and practices
- Application Security Testing
- SAFe (Scaled Agile) processes
Data Integration Focused:
- Data pipeline orchestration and ELT tooling such as Apache Airflow, Apache NiFi, Airbyte, and Singer (see the DAG sketch after this listing)
- Message brokers and streaming data processors like Apache Kafka
- Object storage solutions such as S3, MinIO, LakeFS
- CI/CD pipeline and integration, ideally with Azure DevOps
- Python …
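To make the orchestration item above concrete, a minimal Apache Airflow DAG (assuming Airflow 2.x) might look like the sketch below; the DAG id, task names, and the two callables are hypothetical.

```python
# Minimal Airflow DAG sketch: a daily extract -> load chain.
# DAG id, task names, and the two Python callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling rows from a source system")


def load():
    print("writing rows into the warehouse")


with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract before load
```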
…and pipeline development
- Experience with IaC tools such as Terraform or Ansible for deployment and infrastructure management
- Hands-on experience with: ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, dbt, etc.); data warehousing tools and platforms (Snowflake, Iceberg, etc.); SQL databases, particularly MySQL
Desired Experience:
- Experience with cloud-based services, particularly AWS
- Proven ability to manage stakeholders, their expectations …
…a Senior Data Warehouse Engineer, you'll:
- Help redesign and modernise the existing data warehouse using ELT best practices.
- Migrate legacy ETL workflows to modern tools like Fivetran (or Airbyte) and dbt.
- Optimise data models in Amazon Redshift for performance, scalability, and cost-effectiveness.
- Replace legacy orchestration scripts with Python or Bash for enhanced automation (see the sketch after this listing).
- Enforce best practices for data …
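One shape such a Python replacement for a legacy orchestration script could take is sketched below; the project directory is hypothetical, while `dbt run` and `dbt test` are standard dbt CLI commands.

```python
# Sketch of a small Python driver replacing a legacy cron/shell script:
# run the dbt models, then the dbt tests, and stop on the first failure.
# The project directory is hypothetical.
import subprocess
import sys

PROJECT_DIR = "/opt/warehouse/dbt"  # hypothetical dbt project location


def run(cmd: list[str]) -> None:
    print(f"$ {' '.join(cmd)}")
    result = subprocess.run(cmd, cwd=PROJECT_DIR)
    if result.returncode != 0:
        sys.exit(result.returncode)  # fail fast, like `set -e` in Bash


if __name__ == "__main__":
    run(["dbt", "run"])
    run(["dbt", "test"])
```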
…much more. We are technology-agnostic (but opinionated!) and work across all major cloud providers. Depending on client choice, we either leverage third-party tools such as Fivetran, Airbyte, or Stitch, or build custom pipelines. We use the main data warehouses for dbt modelling and have extensive experience with Redshift, BigQuery, and Snowflake. Recently we've been rolling out a …
…Excellent communication skills: able to simplify technical concepts for non-technical stakeholders
Nice-to-Have:
- Experience working on client-facing consultancy projects
- Knowledge of modern data stack tools (Fivetran, Airbyte, Stitch)
- Python for data transformation or automation
- Familiarity with Git and version control best practices
If you would like to be considered for the Analytics Consultant role and feel you …
…data-focused backend services
- Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s), and data pipelines (SQL, dbt, Airbyte)
- Love automation, process improvement, and finding ways to help others work efficiently
- Are comfortable working autonomously and taking responsibility for the delivery of large technical projects
- Are eager to learn …
…Prometheus.
- We're in the process of transitioning to OpenTelemetry and Honeycomb for our application telemetry (traces and metrics).
- We manage a data pipeline using Pub/Sub, Airbyte, and dbt (see the sketch after this listing).
Our Current Focus
We're currently driving a big shift in how we think about and monitor reliability across the engineering organisation, with a focus on early detection …
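Here is a rough sketch of publishing an event into a Pub/Sub-fed pipeline like the one described, using the google-cloud-pubsub client library; the project ID, topic name, and payload shape are hypothetical.

```python
# Sketch of publishing an event into the kind of Pub/Sub -> Airbyte -> dbt
# pipeline described above. Project ID, topic name, and payload are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "product-events")

event = {"type": "signup", "user_id": "u_123"}

# Pub/Sub messages are raw bytes, so JSON-encode the payload.
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print(f"published message {future.result()}")  # blocks until the server acks
```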
…Prior experience or interest in working with geospatial data
Technologies we use
- Programming languages: SQL, Python, LookML (+ Go for other backend services)
- Development tools and frameworks: dbt, Dagster, Airbyte, dlt, data-diff, Elementary (see the sketch after this listing)
- Data lake and warehouse: GCS, BigQuery
- Analytics: Looker, Looker Studio, and geospatial analytics tools
How we reward our team
Dynamic working environment with a diverse and …
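For a flavour of the Dagster side of this stack, here is a minimal software-defined-asset sketch; the asset names and data are hypothetical, and ingestion would in practice come from Airbyte or dlt rather than an inline list.

```python
# Minimal Dagster software-defined-asset sketch. Asset names and data are
# hypothetical; in the stack above, Airbyte or dlt would own ingestion.
from dagster import asset, materialize


@asset
def raw_rides():
    # Stand-in for an ingestion step.
    return [{"ride_id": 1, "km": 3.2}, {"ride_id": 2, "km": 1.4}]


@asset
def total_distance(raw_rides):
    # Dagster wires the dependency from the argument name.
    return sum(row["km"] for row in raw_rides)


if __name__ == "__main__":
    result = materialize([raw_rides, total_distance])
    print(result.success)
```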
You will be working for the high-growth womenswear brand Sisters & Seekers within a creative and visionary team, all striving to take the brand to the next level. We are based near Chester but have a global reach with customers …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
ISG Talent Partners Ltd
…platform.
Key Skills
- Proven experience with GCP infrastructure solutions
- Strong DevOps expertise, including Terraform and Kubernetes
- Solid understanding of cloud security best practices
- Experience with data engineering integration, specifically Airbyte
- Strong knowledge of ETL/ELT processes
Rate/Working Arrangement
- £600-£650 per day (Outside IR35)
- 100% remote
- 4-day work week
You will be working cross-functionally with a Data Engineering team, so hands-on experience with data workflows and tools like Airbyte is essential. If this sounds like the right role for you, please apply now.