…the data engineering role, ideally as a senior or lead engineer. Hands-on experience building and maintaining Python-based Azure Databricks notebooks in Azure Synapse Analytics. Strong SQL experience, including programming (i.e., SQL queries and stored procedures) and formal database design methodologies. Experience setting up monitoring and data-quality exception handling. Strong data modelling experience. Experience managing …
…integrates seamlessly with their backend database to automate and streamline trade execution operations processes. The platform will optimize operations-based trade processing, leveraging Azure hosting and tools such as SQL and Excel. This role is ideal for a candidate with expertise in backend systems integration, database optimization, and workflow automation. Key Responsibilities: • Design and develop a robust trade execution platform … to process trades efficiently, integrating with existing backend databases. • Enable secure and efficient data file exchange between users. • Database Integration: • Work with SQL databases to handle data extraction, transformation, and loading (ETL), ensuring real-time accuracy. • Build capabilities to handle input and output of data in a range of formats, including Excel, for seamless data exchange and communication with corporate … user operations, and maintenance. Candidate Specification: Technical Skills: • Programming Languages: Strong proficiency in Python, Rust, C#, or Java, and in frameworks such as ASP.NET or Django, for backend development. • Database Expertise: Advanced skills in SQL (querying, optimization, ETL processes) and working knowledge of integrating relational databases. • File Handling: Hands-on experience processing and manipulating Excel files programmatically. • Cloud Technologies: Proficiency in Azure services, including Azure …
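The listing above describes a classic extract-transform-load loop between a trade database and spreadsheet-style files. A minimal, hypothetical sketch of that pattern (table and column names are invented; SQLite and the standard-library csv module stand in for the Azure-hosted database and Excel I/O, which in practice would use a library such as openpyxl):

```python
import csv
import io
import sqlite3

# Hypothetical trade table; schema and names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id TEXT, qty REAL, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("T1", 100, 9.5), ("T2", 250, 10.25)],
)

# Extract + transform: compute the notional per trade directly in SQL.
rows = conn.execute(
    "SELECT trade_id, qty * price AS notional FROM trades ORDER BY trade_id"
).fetchall()

# Load: emit a spreadsheet-friendly file (CSV here as a stand-in for Excel).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["trade_id", "notional"])
writer.writerows(rows)
output = buf.getvalue()
```

In a real build the load step would write workbook cells and the extract step would hit the production database, but the three-stage shape stays the same.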
…relevant technical strategies, principles and standards are applied correctly. Contribute to application, technical, and data architecture design principles. Required Knowledge: Expert knowledge and experience of software development in Java and SQL for reporting applications. Good knowledge and experience of Spring Boot and Spring Batch application development. Expert knowledge of data frameworks, data modelling, database design, SQL, stored procedures, MDX, and APIs. … Good knowledge and experience of the reporting technical stack: MS SQL Server, SSRS, Power BI, SSAS, PostgreSQL. Knowledge of integration and ETL technologies such as Informatica ETL, Spring Batch, and SSIS. Good experience of managing software development teams, processes, and tools. Good knowledge and experience of software engineering design, development, build, and test process and practice. Good experience in agile delivery and ceremonies, e.g. …
…impact through data engineering, software development, or analytics. Demonstrated success in launching and scaling technical products or platforms. Strong programming skills in at least two of the following: Python, SQL, Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server …
…processes. Stay current with industry trends and best practices in BI and data analytics. Required Skills: Proficiency in BI tools (Power BI, Tableau, QlikView) and data visualization techniques. Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle, MySQL). Familiarity with ETL processes and data warehousing concepts. Experience in data modeling and designing effective reporting structures. Knowledge of …
…house development). Proven expertise in Front Office, Operations, and Risk requirements, specifically for Natural Gas trading. Strong understanding of the full SDLC within Agile methodologies. Highly proficient in SQL and database structures. Excellent interpersonal and communication skills; able to work with technical and non-technical stakeholders in a multicultural environment. Strong analytical, problem-solving, and documentation skills. Self-motivated …
…experience with Spring-based technologies (Spring Boot etc.). Experience in designing and implementing REST APIs and microservices-based solutions. Should have experience writing unit/integration tests. Experience writing SQL queries and a good understanding of data models. Working knowledge of AWS cloud (EC2, ECS, Load Balancer, Security Group, Lambda, S3, etc.). Experience in DevOps development and deployment using container …
…Science, Information Systems, Finance, or a related field. Proven experience in a Data Analyst/Analytics Engineer role, preferably in the payments industry with issuer processors. Proven experience in SQL, dbt, and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, … including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding of data modeling principles and experience applying modeling techniques. Proficiency with data visualization tools such as Tableau, Power BI, or similar. Knowledge of payment processing systems, card issuance, and related services. Experience with cloud-based data …
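The dbt-plus-Snowflake work this listing describes is, at its core, writing SQL transformations that stage raw records into clean, deduplicated models. A hedged illustration of the pattern (all table and column names invented; SQLite stands in for Snowflake, and in dbt this SELECT would live in a model file rather than application code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_payments (payment_id TEXT, amount_cents INTEGER, loaded_at TEXT)"
)
conn.executemany(
    "INSERT INTO raw_payments VALUES (?, ?, ?)",
    [
        ("P1", 1000, "2024-01-01"),  # earlier duplicate load
        ("P1", 1000, "2024-01-02"),  # latest load wins
        ("P2", 2550, "2024-01-01"),
    ],
)

# Staging transformation: deduplicate on the business key, keep the most
# recent load, and cast cents to a decimal amount -- the kind of SELECT a
# dbt staging model typically contains.
staged = conn.execute(
    """
    SELECT payment_id,
           amount_cents / 100.0 AS amount,
           MAX(loaded_at)      AS loaded_at
    FROM raw_payments
    GROUP BY payment_id
    ORDER BY payment_id
    """
).fetchall()
```

dbt's contribution on top of SQL like this is dependency ordering, testing, and documentation of each model, not the query language itself.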
…and cost management. Knowledge of data security, compliance, and governance in Azure, including Azure Active Directory (AAD), RBAC, and encryption. Experience working with big data technologies (Spark, Python, Scala, SQL). Strong problem-solving and troubleshooting skills. Excellent communication skills, with the ability to collaborate with cross-functional teams to understand requirements, data solutions, data models, and mapping documents. Preferred …
London, South East England, United Kingdom Hybrid / WFH Options
Axis Capital
…to the company’s success. There are opportunities for professional development, such as training programs, certifications, and career advancement paths. KEY RESPONSIBILITIES Design, develop, and maintain scalable data pipelines using SQL, Azure ADF, Azure Functions, and dbt. Collaborate with analysts and stakeholders to understand their data needs, scoping and implementing solutions. Optimise and clean the data warehouse, cleaning the existing codebase and creating … documentation. Monitor and troubleshoot data pipeline issues. Provide technical support and guidance to junior team members. Provide ongoing support for business applications. Help with data analysis using SQL, Qube, Tableau, HubSpot, and Excel. Perform systems analysis, business process analysis, and design. Project management: developing project plans, monitoring performance and deliverables, and ensuring timely completion of projects. Research and …
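"Monitor and troubleshoot data pipeline issues," as listed above, usually means wrapping pipeline steps with retries and logging so transient failures surface without killing the run. A small, hypothetical sketch of such a wrapper (all names invented; a production version would hook into the orchestrator's alerting):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def run_with_retries(step, *, attempts=3, delay_seconds=0.0):
    """Run a pipeline step, retrying on failure and logging each attempt."""
    for attempt in range(1, attempts + 1):
        try:
            result = step()
            log.info("step %s succeeded on attempt %d", step.__name__, attempt)
            return result
        except Exception:
            log.exception("step %s failed on attempt %d", step.__name__, attempt)
            if attempt == attempts:
                raise  # exhausted retries: let the scheduler mark the run failed
            time.sleep(delay_seconds)


# Hypothetical flaky step: fails twice with a transient error, then succeeds.
calls = {"n": 0}


def load_batch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return 42


result = run_with_retries(load_batch, attempts=3)
```

The same shape applies whether the step is an ADF activity callback, a dbt invocation, or a plain SQL load.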
Experience and Qualifications: Product-Specific Qualifications: Openlink Endur product experience in implementation per role profile; Connex integrations (JMS and WebServices); Advanced Report Builder; Advanced Data Model; Advanced OpenJVS and SQL; User-Defined Sim Results; Advanced TPM workflows; advanced knowledge of concepts such as Regulatory Reporting, Inventory Management, External Pricing Models, and Grid Enabling. OpenComponents .NET is a plus. Business knowledge of … years working in a consulting capacity. Strong experience with Openlink: C# .NET, Java, C++, SOAP web services. Experience with business reporting tools such as Crystal Reports and Openlink’s DMS. Strong SQL scripting skills in Oracle or MSSQL. High proficiency with an RDBMS such as Oracle or MS SQL Server is required. High proficiency with interface implementation following industry-standard integration patterns. Expert ability …
…Oriented design, SOLID principles, and modern design patterns - Development experience in the Microsoft .NET Framework, and experience in front-end JavaScript frameworks like Angular and React - Traditional relational database technologies like Oracle and MS SQL Server, and - NoSQL databases like MongoDB or DynamoDB - SOA and microservices architecture implementation using REST APIs and queue-based messaging patterns; exposure to MuleSoft/Kong is a plus - On-premise/…
…Data Analyst. Technical expertise regarding data models, database design and development, data mining, and segmentation techniques. Strong knowledge of and experience with reporting packages (Business Objects, Tableau, Power BI), databases (SQL, etc.), programming (XML, JavaScript, or ETL frameworks), and data visualization (Power BI, Tableau, etc.). Demonstrated analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of …
City of London, London, United Kingdom Hybrid / WFH Options
Hanson Lee
…automation. Experience with front-end development and cloud platforms/technologies offered by AWS and Microsoft Azure. Experience with databases such as Postgres and Oracle, and database technologies such as SQL and NoSQL. Code versioning, maintenance, and deployment tools like GitHub and JIRA. Excellent problem-solving ability with solid communication and collaboration skills. What Would Make You Stand Out: Experience with popular …
London, South East England, United Kingdom Hybrid / WFH Options
Fitch Group
…be beneficial. Understanding of the System Delivery Life Cycle. Experience of using agile delivery tools such as JIRA, Pivotal, Collab, and Confluence. Experience of engineering based on the likes of SQL, SSIS, Python, Java, Scala, XML/FpML, and Power BI. Data architecture, data lineage, and all aspects of AI including, but not limited to, NLP, ML, deep learning, and Generative …
…remotely and in person to all levels of the organization. You will have a passion for data technologies and will code, test, and debug new and existing applications using SQL and other data engineering and BI tools. You will be driven to learn about FreeWheel's Ad-Tech-driven products (MRM, Beeswax, Strata, and others) and how their processes are … software and data engineering applications in the Snowflake environment. • Displays in-depth knowledge of engineering methodologies, concepts, and skills, and their application in Snowflake and related technologies. • Develop advanced Snowflake SQL and ETL code in support of FreeWheel's BI systems. • Utilize Snowflake standards and best practices. • Work closely with internal stakeholders and business partners to understand their needs and transform … quantitative or similar field required. • 4+ years of experience in data analysis and data pipeline engineering. • Knowledge of Snowflake architecture, best practices, and implementation required. • Strong knowledge of Snowflake SQL development is required. • Experience with visualization BI tools (Looker is required; Tableau and/or Power BI is a plus). • Knowledge of Microsoft Fabric and Microsoft SQL Server is …
…across Africa, Europe, the UK, and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, dbt, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data architectures such as data mesh, lakehouse, data vault, and data warehouses. Our data engineers create pipelines that … practices. Understanding of machine learning workflows and how to support them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, … such as AWS, Azure, or GCP for deploying and managing data solutions. Strong problem-solving and analytical skills, with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying). Apache Spark (for distributed data processing). Apache Spark Streaming, Kafka, or similar (for real-time data streaming). Experience using data tools in at least one …
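The extract-transform-load pipelines this listing centres on can be sketched in miniature with plain Python generators standing in for Spark or Kafka stages (all names and records are invented; a real implementation would use PySpark DataFrames or a streaming client, but the staged shape is the same):

```python
def extract(records):
    """Source stage: yield raw events one at a time, as a stream would."""
    yield from records


def transform(events):
    """Transform stage: drop malformed events and normalise fields."""
    for event in events:
        if "user_id" not in event or event.get("amount") is None:
            continue  # data-quality exception handling: skip bad records
        yield {"user_id": event["user_id"], "amount": round(event["amount"], 2)}


def load(events):
    """Sink stage: collect into an in-memory stand-in for a warehouse table."""
    return list(events)


raw = [
    {"user_id": "u1", "amount": 10.0},
    {"amount": 3.0},                    # malformed: missing user_id
    {"user_id": "u2", "amount": None},  # malformed: null amount
    {"user_id": "u3", "amount": 7.5},
]
warehouse = load(transform(extract(raw)))
```

Because each stage is lazy, records flow through one at a time rather than being materialised between steps, which is the same property that makes Spark's pipelined transformations scale.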
…solutions based on business requirements, adhering to SDLC best practices. BI Solution Development: • Design, develop, test, and deploy end-to-end Business Intelligence solutions using Alteryx for data preparation, SQL for data transformation and storage, Tableau for data visualization, and R/Python for advanced analytics. • Design, implement, and maintain ETL pipelines using Alteryx to extract data from diverse source … and implement complex ETL workflows, and optimize performance. • Data Visualization Proficiency: Proficiency in Tableau Desktop, including the ability to create interactive dashboards, complex calculations, custom visualizations, and performance optimizations. • SQL Development Skills: Mastery of SQL, including the ability to write complex queries, stored procedures, and views, and perform query optimization. • Programming/Statistical Analysis Skills: Working knowledge of R or Python …
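The "R/Python for advanced analytics" part of the stack above usually means scripted statistics downstream of the ETL step. A small illustrative sketch using only Python's standard library (the sales figures are invented):

```python
import statistics

# Hypothetical daily sales figures produced by an upstream ETL job.
daily_sales = [120.0, 135.5, 128.0, 150.5, 141.0]

mean_sales = statistics.fmean(daily_sales)
stdev_sales = statistics.stdev(daily_sales)  # sample standard deviation

# Flag days more than one standard deviation above the mean -- a simple
# screen an analyst might surface in a Tableau dashboard.
outliers = [x for x in daily_sales if x > mean_sales + stdev_sales]
```

In practice the list would come from a SQL query or an Alteryx output, and the threshold would be tuned to the business question, but the compute step looks like this.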
…and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g., Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal, and presentation skills. Desired Skills: Experience with containerization technologies …