Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Widen the Net Limited
Senior Data Engineer/Data Analytics Engineer - Full Remote Working from anywhere in the UK SQL+Python+ETL+Apache Airflow Our client is a leading, fast-growing global high-tech company: -Over 6,500 employees across 20+ offices; -300 million+ active users on some of the platforms they developed … ensure data quality, and support business decision-making with high-quality datasets. -Work across the technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing -Design and develop scalable ETL pipelines to automate data processes and optimize delivery -Implement and manage data warehousing … solutions, ensuring data integrity through rigorous testing and validation -Lead, plan and execute workflow migration and data orchestration using Apache Airflow -Focus on data engineering and data analytics Requirements: -5+ years of experience in SQL -5+ years of development in Python -MUST have strong experience in …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
early and providing strategic guidance. Support the ongoing development of integration strategies involving Managed File Transfer solutions (e.g., GoAnywhere) and data orchestration platforms (e.g., Apache Airflow). Provide hands-on support and detailed guidance on particularly complex integration designs where necessary. Maintain current knowledge of industry trends, technology … and migrations. Familiarity with the IBM Maximo asset management platform. Knowledge and experience with Managed File Transfer solutions (e.g., GoAnywhere). Understanding and experience with the Apache Airflow orchestration platform. Strong grasp of integration best practices, security considerations, and data flow management. Ability to work collaboratively across distributed teams and …
Spark/Scala/Kafka Unix: Scripting and Config Other Highly Valued Skills Include Automation - Python/Bash Scripting Database - Teradata, Oracle Workflow Management: Apache Airflow You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation …
processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. PostgreSQL … SQL Server) Data Integration Tools: Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g. Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience …
with tools and packages like Pandas, NumPy, scikit-learn, Plotly/Matplotlib, and Jupyter Notebooks. Knowledge of ML-adjacent technologies, including AWS SageMaker and Apache Airflow. Proficiency in data pre-processing, data wrangling, and augmentation techniques. Experience with cloud platforms (e.g. AWS, Google Cloud, or Azure) for deploying scalable …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Net Talent
related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
love to talk to you if: You've got solid experience working with Python, SQL, Spark, and data pipeline tools such as dbt or Airflow. You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer). You have …
a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency …
Gateway, and Python to maintain and enhance integrations. … Phase 2: Legacy Pipeline Migration (Months 2-3) Analyze and understand existing R-based data pipelines created by data scientists. Migrate these pipelines into Airflow, dbt, and Terraform workflows. Modernize and scale legacy infrastructure running on AWS. Collaborate with engineering teams to ensure a smooth transition and system stability. … beneficial for interpreting existing scripts) Cloud & Infrastructure: AWS services including Lambda, API Gateway, S3, CloudWatch, Kinesis Firehose; Terraform for infrastructure as code Orchestration & Transformation: Apache Airflow, dbt CRM & Marketing Tools: Braze (preferred); familiarity with other CRM/marketing automation tools such as Iterable or Salesforce Marketing Cloud is …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
NatWest Group
documentation base, as well as contribute to it. Strong software engineering, systems architecture, and unit testing capabilities, and experience using pipeline tools such as Apache Airflow, Amazon SageMaker, or similar. Familiarity with SQL and experience with AWS or other cloud providers. Experience with GitLab CI/CD pipelines …
related field, with a focus on building scalable data systems and platforms. Strong expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures …
related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow or Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT …