Permanent Market Data Developer - Commodities - Python/Market Data/DMS/ETL/Data Warehousing/tick data. As a Senior Market Data Developer, you will be responsible for developing market data capabilities that ensure reliable and accurate consumption and distribution of data feeds. You will advance the development of our DMS, aimed at ensuring market data reliability … building and delivering services for data parsing and distribution. Deliver and enhance data parsers and other data processing mechanisms aligned to a standardised data consumption/distribution model. Steward ETL coding standards, ensuring that code is standardised, self-documenting and can be reliably tested. Lead a team of market data professionals on development projects. Act as SME for market … integrity, among others. Understanding and experience of tooling and technology that support all aspects of the data solutions development life cycle in an agile environment. Detailed working knowledge of ETL/ELT and data warehousing methodologies and best practices, including handling EOD and tick data. Knowledge of different schema structures and design. Deep understanding of deployment and automation workflows. Knowledge …
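To make the parsing responsibility above concrete, here is a minimal Python sketch of the kind of standardised tick parser the posting describes; the feed format, the field names (symbol, epoch, price, size) and the Tick schema are assumptions made for illustration, not the employer's actual model.

```python
# Minimal sketch (not the firm's actual parser): normalising raw vendor tick
# records into a standardised schema before distribution. Field names assumed.
import csv
import io
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Tick:
    symbol: str
    timestamp: datetime
    price: float
    size: int
    venue: str

def parse_ticks(raw: str, venue: str) -> list[Tick]:
    """Parse a vendor CSV feed into standardised Tick records.

    Malformed rows are skipped here; a production parser would route them to
    a rejects store for reconciliation.
    """
    ticks = []
    for row in csv.DictReader(io.StringIO(raw)):
        try:
            ticks.append(Tick(
                symbol=row["symbol"].strip().upper(),
                timestamp=datetime.fromtimestamp(float(row["epoch"]), tz=timezone.utc),
                price=float(row["price"]),
                size=int(row["size"]),
                venue=venue,
            ))
        except (KeyError, ValueError):
            continue  # skip malformed row (or log/quarantine in production)
    return ticks

if __name__ == "__main__":
    sample = "symbol,epoch,price,size\nBRN,1718000000.25,82.41,5\n"
    print(parse_ticks(sample, venue="ICE"))
```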
Lead Market Data Developer - PERM - ETL/ELT, Data Warehousing, Market Data Systems, Python, Docker, Databricks. A global trading firm is seeking a Lead Market Data Developer to join their team permanently. This is a senior, hands-on role with leadership responsibilities, focused on ensuring reliable and accurate market data delivery for business-critical operations. Key Responsibilities: … projects; gather requirements, design solutions, and build and deliver services for data parsing/distribution; build and enhance data parsers aligned to standardized models; enforce coding standards and best practices across ETL processes; manage and mentor a team of market data developers; act as SME for market data-related initiatives; provide second-line support for daily parsing and delivery; collaborate with … processing. Proven experience designing market data systems; delivery of high-quality data applications in agile environments; solid grasp of performance, scalability, and data integrity best practices; deep knowledge of ETL/ELT and data warehousing methodologies; experience with EOD and tick data handling; schema design and optimization; familiarity with deployment/automation workflows. Bonus: experience with Docker and Databricks. …
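As a companion illustration of the EOD and tick data handling this role calls for, the sketch below collapses intraday ticks into a single end-of-day OHLC bar; the EODBar type and the (price, size) tuple input are hypothetical, assumed only for this example.

```python
# Illustrative only: aggregating intraday ticks into an end-of-day (EOD) bar,
# one way a standardized consumption/distribution model might reconcile the
# two data granularities mentioned above. Types and names are assumptions.
from dataclasses import dataclass

@dataclass
class EODBar:
    symbol: str
    open: float
    high: float
    low: float
    close: float
    volume: int

def build_eod_bar(symbol: str, ticks: list[tuple[float, int]]) -> EODBar:
    """Collapse (price, size) ticks, assumed time-ordered, into one EOD bar."""
    if not ticks:
        raise ValueError("no ticks for " + symbol)
    prices = [price for price, _ in ticks]
    return EODBar(
        symbol=symbol,
        open=prices[0],
        high=max(prices),
        low=min(prices),
        close=prices[-1],
        volume=sum(size for _, size in ticks),
    )

print(build_eod_bar("BRN", [(82.41, 5), (82.45, 10), (82.38, 3)]))
```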
We are currently supporting a public sector organisation in the Yorkshire & Humber region that is seeking a forward-thinking ETL Developer to help modernise and enhance their data infrastructure. This role will focus on redesigning and refreshing existing ETL pipelines, which currently use a combination of Python and SSIS, to support the transition to a new centralised … warehouse. The successful candidate will play a key part in migrating these pipelines to Azure Data Factory and providing ongoing support and iteration. Key Responsibilities: redesign and develop new ETL pipelines to support modern data warehousing; migrate and refine existing pipelines from on-premise solutions to Azure Data Factory; access and work with SQL Server databases, including writing T-SQL queries and stored procedures; support and maintain both new and existing ETL processes; structure and prepare data to optimise performance and enable deeper insights. Skills & Experience: proven experience writing new ETL pipelines; strong understanding of data structuring for performance and analytical use; confidence working with SQL Server, including advanced T-SQL and stored procedures; experience refining …
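Purely as an illustration of the kind of work described, below is a minimal Python extract step against SQL Server using a parameterised T-SQL query, shaped so it could be invoked from an Azure Data Factory activity; the environment variables, table and column names are hypothetical, not the organisation's schema.

```python
# A minimal sketch, not the organisation's pipeline: a parameterised extract
# step against SQL Server that an Azure Data Factory activity could call.
# Connection details, table and column names are hypothetical.
import os
import pandas as pd
import pyodbc

def extract_changed_rows(since: str) -> pd.DataFrame:
    """Pull rows changed since the given date using a parameterised T-SQL query."""
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={os.environ['SQL_SERVER']};"
        f"DATABASE={os.environ['SQL_DATABASE']};"
        "Trusted_Connection=yes;"
    )
    try:
        # Parameterised query: avoids string concatenation and injection risk.
        return pd.read_sql(
            "SELECT id, category, created_at, updated_at "
            "FROM dbo.SourceTable WHERE updated_at >= ?",
            conn,
            params=[since],
        )
    finally:
        conn.close()

if __name__ == "__main__":
    df = extract_changed_rows("2024-01-01")
    df.to_csv("source_table_delta.csv", index=False)  # staged for the warehouse load
```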
… appropriate use cases for data lineage, DQ, profiling, big data technologies and cloud platform usage. Role/Position Overview: We are looking to bring in a Senior Ab Initio Metadata Developer with sufficient MDH (Metadata Hub) knowledge to work on one of the large, industry-wide transformation products at Citi, moving to a single-ledger solution for all Citi financial … platform and a next-generation Business Rules platform using the latest Ab Initio technologies. The candidate must have relevant design and development experience with ETL tools and technologies, and should be a strong team player. Exposure to Finance or Risk functions across Retail Banking or Wholesale/Investment Banking products is preferred. This is a … significant opportunity for a developer experienced in modern Ab Initio platforms to move into a role working with a variety of development teams, including close collaboration with an on-site finance team. To be successful in this role, you will need proven experience developing solutions/platforms for financial markets environments. It is expected …
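Ab Initio and Metadata Hub development is graph- and metadata-driven and is not shown here; purely as a generic illustration of the profiling and DQ checks the posting mentions, the Python sketch below reports a column's null rate and distinct count over hypothetical ledger records.

```python
# Generic illustration only (no Ab Initio/MDH code): the kind of profiling and
# data-quality check alluded to above, expressed in plain Python over records.
def profile_column(rows: list[dict], column: str) -> dict:
    """Report null rate and distinct count for one column: basic DQ profiling."""
    values = [row.get(column) for row in rows]
    non_null = [v for v in values if v is not None]
    return {
        "column": column,
        "rows": len(values),
        "null_rate": 1 - len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
    }

# Hypothetical ledger-style records used only to exercise the check.
rows = [{"ledger_id": "A1", "amount": 100.0},
        {"ledger_id": "A2", "amount": None},
        {"ledger_id": "A1", "amount": 50.0}]
print(profile_column(rows, "amount"))  # null_rate ~ 0.33, distinct = 2
```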