…downstream consumption
- Work with customers to build dashboards with the right KPIs and metrics for decision making
- Data quality checks, ETL/ELT processes, automation

Technical Requirements:
- Strong proficiency in SQL and Python programming
- Extensive experience with data modeling and data warehouse concepts
- Advanced knowledge of AWS data services, including S3, Redshift, AWS Glue, AWS Lambda
- Experience with Infrastructure as Code (see the sketch after this listing)
Employment Type: Contract
Rate: £350 - £400/day, plus PTO, pension and National Insurance
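For illustration only: the AWS-flavoured requirements above (S3, Lambda, data quality checks) might translate into something like the minimal Lambda sketch below. The bucket/key layout, required columns, and checks are assumptions, not details taken from the listing.

```python
# Minimal sketch of an AWS Lambda data-quality check on a CSV landing in S3.
# The "required columns" and event wiring are illustrative assumptions only.
import csv
import io

import boto3

s3 = boto3.client("s3")

REQUIRED_COLUMNS = {"order_id", "customer_id", "order_date", "amount"}  # assumed schema


def handler(event, context):
    """Triggered by an S3 put event; fails loudly if the file misses required columns."""
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))

    missing = REQUIRED_COLUMNS - set(rows[0].keys()) if rows else REQUIRED_COLUMNS
    if missing:
        raise ValueError(f"{key}: missing columns {sorted(missing)}")

    empty_amounts = sum(1 for r in rows if not r.get("amount"))
    return {"key": key, "rows": len(rows), "empty_amounts": empty_amounts}
```

In practice a check like this would feed an alerting or quarantine step rather than returning a summary, but the shape of the work is the same.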
financial reporting, planning, budgeting, and forecasting processes preferred. Previous experience with financial systems such as Oracle Hyperion Essbase, Oracle Hyperion Planning, JDE, Hubble, EPMWare, Authority Suite, Control-M, MS SQL, Power BI, MS Azure, or MS Copilot is advantageous.

Additional Information: This role comes with a competitive salary, including a bonus (company and individual), non-contributory pension, and one-off …
stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms (AWS, GCP, or Azure). Familiarity with version control, CI/CD, and …
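As a rough sketch of the Airflow/Python pipeline development this listing refers to, the snippet below uses Airflow's TaskFlow API (Airflow 2.4+ assumed); the DAG id, schedule, and sample data are invented for illustration.

```python
# Minimal Airflow DAG sketch (TaskFlow API, Airflow 2.4+). DAG id, schedule,
# and the toy data are assumptions, not anything from the listing.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_pipeline():
    @task
    def extract() -> list[dict]:
        # A real task would query a source system or API here.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": -1.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop obviously invalid rows before loading.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # A real task would COPY/MERGE into Snowflake, BigQuery or Redshift.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


orders_pipeline()
```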
…related discipline.
- 4–5 years of experience in data analysis, preferably in financial services or private credit/asset management.
- Strong Excel, Power BI and PowerPoint skills.
- Familiarity with SQL, Python, or other relevant programming languages is highly desirable.
- Experience working with large data sets and enterprise data systems.
- Strong attention to detail and a commitment to high-quality output.
…quality, utilizing TDD methodologies to ensure code reliability and maintainability.
- Security Practices: Knowledgeable in cybersecurity practices, including OAuth, OpenID Connect, and secure coding practices.
- Advanced Database Knowledge: Proficient in SQL and data modelling.
- SOLID Principles: Proficient in applying SOLID principles for object-oriented programming, ensuring clean, maintainable, and scalable code.

Ideal to have:
- SC Clearance
- Python
- Docker and Kubernetes: These …
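Purely as an illustration of the TDD style mentioned above, here is a test-first sketch around a hypothetical `mask_email` helper; both the function and its behaviour are assumptions, not anything from the role.

```python
# TDD-style sketch: the tests pin down the behaviour of a hypothetical helper,
# mask_email(), and the implementation is written to satisfy them.
import pytest


def mask_email(address: str) -> str:
    """Hide the local part of an email address, keeping the first character."""
    local, _, domain = address.partition("@")
    if not local or not domain:
        raise ValueError(f"not an email address: {address!r}")
    return f"{local[0]}***@{domain}"


def test_mask_email_keeps_domain():
    assert mask_email("alice@example.com") == "a***@example.com"


def test_mask_email_rejects_garbage():
    with pytest.raises(ValueError):
        mask_email("not-an-email")
```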
…within ERP systems as part of transformation and integration projects.

What You Bring:
- 5+ years of experience in data analytics or reporting
- Strong skills in Power BI or Tableau, SQL, and Python
- Experience with cloud platforms (Azure preferred), Alteryx, and Snowflake
- Hands-on experience with data migration and integration across ERP platforms

Essential: Bachelor’s degree in Data Science, Computer …
Qualifications: Essential: Proven experience in designing, building, optimizing, deploying, and managing business-critical machine learning models using Azure ML in production environments. Strong skills in data wrangling using Python, SQL, and ADF. Proficiency in CI/CD, DevOps/MLOps, and version control systems. Familiarity with data visualization and reporting tools, ideally Power BI. Excellent communication and interpersonal skills, with …
…engineering technologies, tools and practices.

Qualifications:
- Bachelor's degree in computer science, engineering or related field.
- Minimum of 3 years of experience in data engineering roles.
- Strong proficiency in SQL, ETL processes and database management systems (e.g., MySQL, PostgreSQL, MongoDB).
- Hands-on experience with AWS services for data processing, storage and analysis (e.g., S3, Redshift, EMR, Glue).
- Familiarity …
…Data Science, Computer Science, Information Systems, or other equivalent.
- Experienced in data analysis, BI, or a related area
- Familiarity with BI tools (e.g. Tableau, Power BI, Looker) and basic SQL for data querying
- Proficiency in Microsoft Excel for data manipulation and reporting
- Solid ability to analyse and interpret data, with keen attention to detail
- Basic experience with data modelling
- Exposure …
…analytical and communication skills; ability to influence stakeholders and translate complex requirements clearly.
- You have experience with data visualization/reporting tools (e.g. Tableau, Power BI, Looker)
- You have SQL or data querying experience to support analysis and validation
- You are fluent in English

Our Benefits:
- An opportunity to work in a fast-growing fintech revolutionizing investment reporting
- Hybrid style …
…you’re the right candidate, you likely:
- Are pragmatic and outcome-focused
- Think scientifically, validate assumptions, seek evidence (a small sketch follows this list)
- Have hands-on experience with machine learning models
- Are proficient with Python, SQL, Bash, HTML/CSS/JS, and Excel; familiar with Jupyter, pandas, scikit-learn, PyTorch, CI/CD, Git
- Understand probability and statistics
- Are experienced with containerisation (Docker, Kubernetes)
- Have knowledge of cloud …
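To make the "validate assumptions, seek evidence" point concrete, here is a minimal scikit-learn cross-validation sketch; the dataset and model are arbitrary choices for illustration, not part of the role.

```python
# Minimal sketch: judge a simple model with cross-validation rather than a
# single train/test split. Dataset and model choice are illustrative only.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Five-fold cross-validation gives evidence about variance, not just a point score.
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"ROC AUC: {scores.mean():.3f} ± {scores.std():.3f}")
```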
a Lead Data Analyst or similar role in a data-driven environment. Hands-on expertise with Earnix or similar pricing/analytics platforms is highly desirable. Strong proficiency in SQL and Snowflake for data manipulation and transformation. Experience working with Power BI for dashboard development and data visualization. Solid understanding of data validation, cleansing, and ETL processes. Excellent communication and …
looking for? Hands-on experience designing greenfield, scalable data platforms in the cloud using the Azure D&A stack, Databricks, and Azure OpenAI solutions. Proficiency in coding (Python, PL/SQL, shell script), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in the Azure Data and Analytics stack; working knowledge of AWS and GCP data …
…and supporting automated workflows for various areas of the business. Experience with the deployment of Alteryx workflows to the Production Gallery. Knowledge of database fundamentals, including data design, SQL, data warehouse design concepts, and querying concepts, is beneficial. Exposure to Power BI, Databricks, Microsoft Azure and Profisee is an advantage. Knowledge of JSON, Python, XML and R is an advantage.
or related fields. Certifications or a Master’s degree (e.g., MBA, MS in Business Analytics) are a plus. Tools: Experience with visualization and analysis tools like Tableau, Power BI, SQL, Excel, PowerPoint.
…great Grayce Analyst:
- 2:1 Undergraduate Degree: In a STEM field, with a solid academic basis in data analysis; MSc is a plus.
- Data Tools: Experience with Excel, R, SQL, or Python is essential. Experience with Tableau or Power BI is advantageous.
- Analytical Problem Solving: Ability to spot patterns, think creatively, and dissect complex issues.
- Soft Skills: Effective communication, planning …
London, England, United Kingdom Hybrid / WFH Options
Mirai Talent
…supporting data workflows. Promote best practices regarding code quality, testing, observability, and operational stability.

Ideal Candidate:
- Possesses 2+ years of practical data engineering experience.
- Has strong skills in Python, SQL, and PySpark.
- Experienced working with data lakes, warehouses, lakehouses, and cloud platforms, preferably Azure.
- Knowledgeable in data modelling, including Kimball and star schemas (a small sketch follows this listing).
- Familiar with ETL tools such as Azure …
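As a small, hypothetical example of the Kimball-style modelling mentioned in this listing, the PySpark snippet below derives a date dimension and a fact table from a toy orders feed; all table and column names are assumptions.

```python
# Sketch of a Kimball-style star-schema step in PySpark: derive a date dimension
# and join it back to produce a fact table. Names and schema are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

orders = spark.createDataFrame(
    [(1, "2024-01-05", 120.0), (2, "2024-01-06", 80.0)],
    ["order_id", "order_date", "amount"],
)

# Date dimension keyed by an integer surrogate key (yyyymmdd).
dim_date = (
    orders.select("order_date")
    .distinct()
    .withColumn("date_key", F.date_format("order_date", "yyyyMMdd").cast("int"))
)

# Fact table references the dimension via date_key instead of the raw date.
fact_orders = orders.join(dim_date, "order_date").select("order_id", "date_key", "amount")

fact_orders.show()
```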