Amazon EKS, Amazon S3, AWS Glue, Amazon RDS, Amazon DynamoDB, Amazon Aurora, Amazon SageMaker, Amazon Bedrock (including LLM hosting and management). Expertise in workflow orchestration tools such as Apache Airflow. Experience implementing DataOps best practices and tooling, including DataOps.Live. Advanced skills in data storage and management platforms such as Snowflake. Ability to deliver insightful analytics via business intelligence
ensure the smooth delivery of daily data loads across multiple systems. Key Responsibilities Lead daily data operations and ensure end-to-end completion of data loads. Monitor and debug Airflow workflows, resolving issues efficiently. Manage a team of offshore data engineers, providing technical direction and support. Collaborate with stakeholders to troubleshoot data discrepancies and root-cause issues in reports. … business and technical stakeholders to improve data reliability and transparency. Identify opportunities for automation and process optimisation once BAU stability is achieved. Technical Environment AWS (data storage and processing), Apache Airflow (workflow orchestration), Power BI (reporting and analytics). What We're Looking For Strong background in data engineering or data operations. Experience managing or mentoring offshore technical teams.
enterprise architecture frameworks (TOGAF, Zachman). Expertise in cloud-native data platforms (Azure, AWS, GCP) and data modelling. Experience with data mesh, governance, and integration/orchestration tools (Kafka, Airflow, dbt). Knowledge of lakehouse and virtualization concepts. Exposure to AI/GenAI initiatives and ethical data practices. Skilled communicator with experience mentoring others in matrix environments. Awareness of
ECS images – largely in NodeJS.) ETL Pipeline Management: Develop and optimise data pipelines to enable seamless data flow and transformation. (We currently use a mix of SSIS, ETL Works, Airflow, and Snowflake, and are moving to an Airflow/Snowflake-only architecture.) Snowflake Management: Create production-ready procedures in Snowflake for moving and analysing data. System Optimisation: Improve existing backend … Can handle sensitive and confidential information. Experience working with non-data stakeholders to translate their needs and generate useful results presented in an understandable way. Familiarity with orchestration tools (Airflow, dbt) and data warehouse modelling. Managing other data engineers. Experience with customer and commercial datasets, especially in retail or FMCG. A love of pets! About Jollyes Pets Jollyes are
Akkodis is a global leader in engineering, technology, and R&D, harnessing the power of connected data to drive digital transformation and innovation for a smarter, more sustainable future. As part of the Adecco Group, Akkodis combines the expertise of