Senior Data Engineer
Location - London, Bristol or Manchester (1 day a month onsite)
Duration - 6 months
Rate - £550 - £600 per day (inside IR35)
As a Data Engineer in the Cyber and Domains Protection Team, you will:
- Work within an Agile team to support the development of dashboards and build automated reports to meet the needs of technical and non-technical users
- Work with the data analyst and user researcher to update relevant data models to allow business intelligence data to meet the organisation's specific needs
- Develop business intelligence reports that can be automated, reused and shared with users directly
- Implement data flows to connect operational systems, data for analytics and business intelligence (BI) systems
- Build accessible data for analysis
- Deliver data solutions in accordance with agreed organisational standards that ensure services are resilient, scalable and future-proof
- Investigate problems in systems, processes and services
This role aligns to the Data Analyst role in the Government Digital and Data Profession Capability Framework. At this role level, your skills include:
- Applying statistical and analytical tools and techniques
- Communicating between the technical and non-technical
- Data ethics and privacy
- Data management
- Data preparation and linkage
- Data visualisation
- Developing code for analysis
You will also have the following specialist skills, at Working level:
- Advanced SQL proficiency: expertise in writing complex, highly performant SQL queries, including common table expressions (CTEs), window functions and multi-table joins. Experience with query optimisation and performance tuning on relational databases such as PostgreSQL or MySQL (a brief illustrative sketch follows the data modelling list below)
- Cloud data ecosystem (AWS): hands-on experience with core AWS data services, including:
  - S3 for data lake storage
  - AWS Glue for ETL and data cataloguing
  - Amazon Redshift or Athena for data warehousing and analytics
  - Lambda for event-driven data processing
- ETL/ELT pipeline development: experience in designing, building, and maintaining robust, automated data pipelines. You should be comfortable with both the theory and practical application of extracting, transforming, and loading data between systems
- Programming for data: strong scripting skills, particularly in Python
- Infrastructure as code (IaC): experience deploying and managing cloud infrastructure using tools such as Terraform, AWS CDK or CloudFormation (a minimal sketch follows this list)
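By way of illustration, here is a minimal AWS CDK v2 (Python) sketch of the kind of IaC deployment described above. This is a sketch only, assuming CDK v2; the stack and bucket names are hypothetical, and the bucket stands in for the S3 data lake storage mentioned earlier.

```python
# Illustrative sketch only: a minimal AWS CDK v2 stack defining a versioned,
# encrypted S3 bucket for data-lake storage. Stack and bucket names are
# hypothetical.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3

class DataLakeStack(cdk.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        # Raw-zone bucket: versioning and server-side encryption enabled.
        s3.Bucket(
            self,
            "RawDataBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
        )

app = cdk.App()
DataLakeStack(app, "DataLakeStack")
app.synth()
```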
Data modelling and warehousing:
- Dimensional data modelling: deep understanding of data warehousing concepts and best practices, with experience of transforming raw transactional data into well-structured, analytics-ready datasets using schemas such as the star schema (Kimball methodology)
- Data quality and governance: the ability to build trust in data by implementing data validation checks, testing frameworks and clear documentation within your pipelines (an illustrative sketch follows this list)
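As a brief illustration of the query style and validation step described above, the following self-contained sketch uses Python's built-in SQLite (which supports window functions from version 3.25) in place of PostgreSQL; the table, columns and sample values are hypothetical.

```python
# Illustrative only: a CTE plus window function of the kind described above,
# run against an in-memory SQLite database so the sketch is self-contained.
# The domain_events table and its columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE domain_events (domain TEXT, seen_at TEXT, risk_score REAL);
    INSERT INTO domain_events VALUES
        ('example.gov.uk', '2024-01-01', 0.2),
        ('example.gov.uk', '2024-01-08', 0.7),
        ('other.gov.uk',   '2024-01-03', 0.4);
""")

# The CTE ranks each domain's events by recency; the outer query keeps the
# latest event per domain, a typical "current state" analytics view.
query = """
WITH ranked AS (
    SELECT domain, seen_at, risk_score,
           ROW_NUMBER() OVER (PARTITION BY domain ORDER BY seen_at DESC) AS rn
    FROM domain_events
)
SELECT domain, seen_at, risk_score FROM ranked WHERE rn = 1;
"""
rows = conn.execute(query).fetchall()

# A minimal data-quality check of the kind a pipeline might assert.
assert rows, "validation failed: analytics view is empty"
for domain, seen_at, score in rows:
    print(domain, seen_at, score)
```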
Experience in the following areas is not essential but would be beneficial:
- Data orchestration tools: familiarity with modern workflow management tools such as Apache Airflow, Prefect or Dagster (a minimal DAG sketch follows this list)
- Modern data transformation: experience with dbt (data build tool) for managing the transformation layer of the data warehouse
- BI tool familiarity: an understanding of how BI tools such as AWS QuickSight consume data, and the ability to structure datasets optimally for visualisation and reporting
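For illustration, here is a minimal DAG of the kind such orchestration tools manage. This is a sketch only, assuming Airflow 2.4 or later; the DAG id and the placeholder extract/transform callables are hypothetical.

```python
# Illustrative sketch only: a two-step Airflow DAG (Airflow 2.4+) wiring an
# extract task ahead of a transform task on a daily schedule.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the operational system")

def transform():
    print("build the analytics-ready dataset")

with DAG(
    dag_id="daily_bi_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    # Transform runs only after extract succeeds.
    extract_task >> transform_task
```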
Please submit a copy of your latest CV for more information on this vacancy.
- Company: Adecco
- Location: City of London, London, United Kingdom
- Employment type: Contract
- Salary: £550 - £600/day