Required Skills: Proven experience managing Power BI deployments (including workspaces, datasets, and reports). Strong understanding of data pipeline deployment using tools like Azure Data Factory, AWS Glue, or Apache Airflow. Hands-on experience with CI/CD tools (Azure DevOps, GitHub Actions, Jenkins). Proficiency in scripting (PowerShell, Python, or Bash) for deployment automation. Experience with manual deployment …
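The scripting requirement above usually means automating report deployment against the Power BI REST API from a CI/CD pipeline. Below is a minimal sketch of that idea in Python; the access token, workspace ID, file path, and dataset name are placeholders, and a real pipeline would obtain the token from Azure AD (for example via a service principal) and read the IDs from pipeline variables rather than hard-coding them.

```python
import requests

# Placeholders: a real deployment reads these from CI/CD variables or a key vault.
ACCESS_TOKEN = "<azure-ad-access-token>"
WORKSPACE_ID = "<target-workspace-guid>"
PBIX_PATH = "report.pbix"
DATASET_NAME = "SalesReport"

# Power BI "Post Import In Group" endpoint: uploads a .pbix into a workspace,
# overwriting an existing dataset of the same name if one exists.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports"
    f"?datasetDisplayName={DATASET_NAME}&nameConflict=CreateOrOverwrite"
)

with open(PBIX_PATH, "rb") as pbix_file:
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        files={"file": pbix_file},
    )

response.raise_for_status()
print("Import accepted:", response.json())
```

The same call can be wrapped in an Azure DevOps or GitHub Actions step so that merging to main publishes the report automatically.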
Cambridge, England, United Kingdom Hybrid / WFH Options
Bit Bio
Spark-Streaming. Relational SQL and NoSQL databases, including Postgres and Cassandra. Experience designing and implementing knowledge graphs for data integration and analysis. Data pipeline and workflow management tools: Luigi, Airflow, etc. AWS cloud services: EC2, S3, Glue, Athena, API Gateway, Redshift. Experience with object-oriented and scripting languages: Python, R. Designing and building APIs (RESTful, etc.). Understanding of FAIR …
Job Description: Snowflake Architect with Azure (Permanent Role). Basildon, UK (work from the client office 5 days a week). Responsibilities: Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS. Implement data warehousing solutions, ensuring efficient storage …
Akkodis is a global leader in engineering, technology, and R&D, harnessing the power of connected data to drive digital transformation and innovation for a smarter, more sustainable future. As part of the Adecco Group, Akkodis combines the expertise of …
platform, ensuring scalability, reliability, and security. Drive modernisation by transitioning from legacy systems to a lean, scalable platform. Act as a lead expert for technologies such as AWS, DBT, Airflow, and Databricks. Establish best practices for data modelling, ingestion, storage, streaming, and APIs. Governance & Standards: Ensure all technical decisions are well-justified, documented, and aligned with business needs. Lead … in data engineering and cloud engineering, including data ingestion, transformation, and storage. Significant hands-on experience with AWS and its data services. Expert-level skills in SQL, Python, DBT, Airflow, and Redshift. Confidence in coding, scripting, configuring, versioning, debugging, testing, and deploying. Ability to guide and mentor others in technical best practices. A product mindset, focusing on user needs …
Norwich, England, United Kingdom Hybrid / WFH Options
Political, International and Development Studies Student Association
and/or Data Warehousing. An understanding of Cloud Technologies (hands-on experience not required). Development of data pipelines based on SQL with orchestration tools such as Airflow or the Tivoli scheduler. Intermediate programming skills in Python. Awareness of CI/CD practices. We’re looking for confident communicators who can break down technical concepts in a clear …
common life sciences data acquisition software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes). ZONTAL is an …
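Several of the postings above ask for hands-on experience with pipeline orchestration in Apache Airflow. As a rough illustration of what that involves, here is a minimal Airflow 2.x DAG that chains an extract step and a parsing step; the DAG ID, task IDs, and callables are hypothetical stand-ins for whatever instrument or LIMS data the role actually handles.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_raw_files():
    # Placeholder: a real pipeline would pull raw instrument/LIMS exports
    # from a landing zone (e.g. an S3 bucket or an SDMS endpoint).
    print("extracting raw files")


def parse_raw_files():
    # Placeholder: convert the raw exports into a structured, queryable format.
    print("parsing raw files")


with DAG(
    dag_id="example_lab_data_ingest",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older 2.x versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_raw_files", python_callable=extract_raw_files)
    parse = PythonOperator(task_id="parse_raw_files", python_callable=parse_raw_files)

    # Run the parsing step only after extraction has succeeded.
    extract >> parse
```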
causes. Key Responsibilities: BI Operations & System Monitoring: Ensure the day-to-day smooth running of BI data pipelines, reporting systems, and cloud-based ETL/ELT workflows (AWS, Redshift, Glue, Airflow). Monitor system uptime and performance, proactively addressing issues to maintain SLA compliance. Set up monitoring dashboards and look after system availability and operations. Strong observability setup skills to constantly look for … Experience: 5+ years of experience (minimum 3 years in production support for AWS BI/Data Engineering). Technical Experience: Hands-on experience with the AWS BI stack (Redshift, Glue, Airflow, S3, Step Functions). Data Engineering & ETL: Redshift, Glue, Airflow, S3, Step Functions, Spark. Reporting & Visualisation: Power BI (preferred), Business Objects, Qlik. Data Warehousing: data modelling, ETL processes …
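For production-support duties like those described above, a typical first building block is a script that polls the state of the ETL jobs and raises an alert on failure. The sketch below does this for an AWS Glue job with boto3; the job name is hypothetical, and a real setup would push the result into whatever alerting or dashboarding tool the team uses rather than printing it.

```python
import boto3

# Hypothetical job name; the actual Glue jobs being monitored are not named in the posting.
GLUE_JOB_NAME = "example_bi_etl_job"


def latest_glue_run_state(job_name: str) -> str:
    """Return the state of the most recent run of a Glue job (or NO_RUNS if it has never run)."""
    glue = boto3.client("glue")
    runs = glue.get_job_runs(JobName=job_name, MaxResults=1)["JobRuns"]
    return runs[0]["JobRunState"] if runs else "NO_RUNS"


if __name__ == "__main__":
    state = latest_glue_run_state(GLUE_JOB_NAME)
    if state in ("FAILED", "TIMEOUT", "ERROR"):
        # In practice this would raise a PagerDuty/SNS/Teams alert instead of printing.
        print(f"ALERT: {GLUE_JOB_NAME} last run ended in {state}")
    else:
        print(f"{GLUE_JOB_NAME} last run state: {state}")
```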
Luton, England, United Kingdom Hybrid / WFH Options
easyJet Airline Company PLC
a mixed team of onshore and offshore resources. Key Skills Required: Experience with a combination of cloud-based big data technologies (e.g. HDFS, Blob Storage, Spark, Kafka, Delta, Hive, Airflow, and DBT) and with OLTP and Data Warehousing within SQL Server or other RDBMSs. Familiarity with Lakehouse architecture and Databricks. Extensive data modelling experience, from conceptual to logical …
ETL using big data tools (HIVE/Presto, Redshift). Previous experience with web frameworks for Python such as Django/Flask is a plus. Experience writing data pipelines using Airflow. Fluency in Looker and/or Tableau. Strong understanding of data warehousing principles, pipelines, and APIs. Strong communication skills. The ability to work independently and across multiple time zones …
ideally with some prior management or lead responsibility. A real passion for coaching and developing engineers. Hands-on experience with their tech stack: any cloud, Snowflake (or equivalent), Python, Airflow, Docker. Ability to juggle multiple products and effectively gather requirements. Experience with real-time data products is a big plus. Strong communication skills and a good academic background. HOW … in your career with a leading global firm? Please register your interest by sending your CV via the Apply link on this page. Desired Skills and Experience: Snowflake, Python, Airflow, Docker.
Cambridge, Cambridgeshire, UK Hybrid / WFH Options
RP International
Code • Create Terraform scripts for infrastructure provisioning • Set up infrastructure security • Implementation of IAM policies and permissions for applications and GitHub • Changes of environment variables for applications and Airflow DAGs. If you are interested, please hit apply with your updated CV and we will arrange a call to further your application.
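The posting names Terraform for provisioning; for the IAM-policy part of the work, the sketch below shows the equivalent shape of a least-privilege inline policy attached with boto3/Python instead (kept in Python for consistency with the other examples here). The role name, policy name, and bucket are placeholders, not values from the posting.

```python
import json

import boto3

# Placeholders: the real role, policy name, and bucket come from the environment being provisioned.
ROLE_NAME = "example-app-role"
POLICY_NAME = "example-app-s3-read"
DATA_BUCKET = "example-data-bucket"

# Least-privilege inline policy: read-only access to a single bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{DATA_BUCKET}",
                f"arn:aws:s3:::{DATA_BUCKET}/*",
            ],
        }
    ],
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName=POLICY_NAME,
    PolicyDocument=json.dumps(policy_document),
)
print(f"Attached inline policy {POLICY_NAME} to role {ROLE_NAME}")
```

In a Terraform-based setup the same policy document would live in an aws_iam_role_policy resource rather than a script, but the shape of the permissions is identical.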