Edinburgh, Central Scotland, United Kingdom Hybrid / WFH Options
Net Talent
… related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT …
… roadmaps, plans and delivery. Knowledge of some of the specific technologies we leverage would be an advantage; these are: Python, SQL, Snowflake, Tableau, Airflow, Amazon SageMaker, Kafka and React.
… our product experience. You'll typically be working in Java or Python, with a technology stack that includes AWS, Kinesis, S3, Kubernetes, Spark, Airflow, gRPC, New Relic, Databricks, and more. This role requires expertise in distributed systems, microservices, and data pipelines, combined with a strong focus on observability …
… including data pipelines, orchestration and modelling. Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, DBT and Looker, ensuring the team follows agile methodologies to improve delivery cadence and responsiveness. Contribute to hands-on coding, particularly in areas … Foster team growth and development. Strong understanding of the data engineering lifecycle, from ingestion to consumption. Hands-on experience with our data stack (Redshift, Airflow, Python, DBT, MongoDB, AWS, Looker, Docker). Understanding of data modelling, transformation, and orchestration best practices. Experience delivering both internal analytics platforms and external data …