Data Engineer II
Would you like to ensure the successful delivery of the Data Platform and Software Innovations? Do you enjoy creating a collaborative and customer-focused working environment?

About the Team: LexisNexis Intellectual Property, which serves customers in more than 150 countries with 11,300 employees worldwide, is part of RELX, a global provider of information-based analytics and decision tools for professional and business customers.

About the Role: As a Data Engineer at LexisNexis Intellectual Property (LNIP), you'll contribute to building and maintaining our next-generation Strategic Data Platform. This platform ingests, enriches, and transforms global patent and IP-related data to power key products like PatentSight+, as well as a growing ecosystem of internal tools and customer-facing solutions. In this early-career role, you will collaborate with senior engineers and technical leads to design robust data pipelines, apply engineering best practices, and support the delivery of high-quality data through modern platforms such as Databricks, APIs, and event-driven systems. You'll gain practical experience working at scale while contributing to the delivery of data directly to customers and systems across the organisation.

Key Responsibilities:
- Contributing to the development and maintenance of data pipelines using Python, PySpark, and Databricks
- Supporting the delivery of enriched datasets to customers via Databricks, RESTful APIs, and event-driven delivery mechanisms (e.g., Kafka or similar)
- Assisting in data ingestion, transformation, and enrichment across the medallion architecture (bronze → silver → gold); see the sketch after this list
- Collaborating with cross-functional teams, including engineers, data analysts, and product managers
- Participating in code reviews, unit testing, and documentation to ensure high code quality and maintainability
- Troubleshooting and debugging data issues across development and production environments
- Following and contributing to internal best practices around data engineering and software development
- Continuously developing technical skills and understanding of the business domain
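To give a flavour of this kind of pipeline work, below is a minimal PySpark sketch of a bronze-to-silver step in a medallion-style layout. It assumes a Databricks-style environment with Delta Lake available; the table paths, column names, and cleaning rules are hypothetical and purely illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal illustrative bronze -> silver step; all paths and columns are hypothetical.
spark = SparkSession.builder.appName("patents-bronze-to-silver").getOrCreate()

# Bronze: raw patent records as ingested from the source feed.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/patents")

# Silver: deduplicated, typed, and filtered records ready for enrichment.
silver = (
    bronze
    .dropDuplicates(["publication_number"])
    .withColumn("publication_date", F.to_date("publication_date", "yyyy-MM-dd"))
    .filter(F.col("publication_number").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/patents")
```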
Requirements:
- Hands-on experience in a software or data engineering role.
- Proficiency in Python and working knowledge of PySpark or similar distributed data frameworks.
- Familiarity with Databricks or a strong interest in learning and working with the platform.
- Understanding of data delivery patterns, including REST APIs and event-driven architectures (see the sketch after this list).
- Experience with SQL and structured data manipulation.
- Familiarity with version control systems (e.g., Git).
- Strong problem-solving mindset and willingness to learn from feedback.
- Good communication skills and ability to work in a team setting.
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Experience working with large-scale or open datasets.
- Familiarity with medallion architecture or similar data lake patterns.
- Understanding of data quality principles and CI/CD pipelines for data workflows.
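As an illustration of event-driven delivery, here is a small sketch that publishes one enriched record to a Kafka topic using the kafka-python client; the broker address, topic name, and record fields are hypothetical.

```python
import json
from kafka import KafkaProducer  # kafka-python client

# Hypothetical broker and topic, purely for illustration.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

# Publish a single enriched patent record as an event.
enriched_record = {"publication_number": "EP1234567A1", "family_size": 4}
producer.send("enriched-patents", value=enriched_record)
producer.flush()
```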
Benefits:
- Flexible working hours, so you can adjust when you work during the day to fit everything in and be at your most productive
- Generous holiday allowance with the option to buy additional days
- Health screening, eye care vouchers and private medical benefits
- Wellbeing programs
- Life assurance
- Access to a competitive contributory pension scheme
- Save As You Earn share option scheme
- Season ticket loan for travel
- Electric vehicle scheme
- Optional dental insurance
- Maternity, paternity, and shared parental leave
- Employee Assistance Programme
- Access to emergency care for both the elderly and children
- RECARES days, giving you time to support the charities and causes that matter to you
- Access to employee resource groups with dedicated time to volunteer
- Access to extensive learning and development resources
- Access to the employee discounts scheme via Perks at Work
- Company: Disability Solutions
- Location: London, UK