such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery) Expertise in building data architectures that support batch and streaming paradigms Experience with standards such as JSON, XML, YAML, Avro, Parquet Strong communication skills Open to More ❯
improve our technology stack. Develop oneself into a Subject Matter Expert (SME) on Technical and Functional domain areas. What we value Demonstrated experience in Python, PySpark and SQL (AWS Redshift, Postgres, Oracle). Demonstrated experience building data pipelines with PySpark and AWS. Application development experience in financial services with hands-on designing, developing, and deploying complex applications. Demonstrated ability to More ❯
Our Client A new UK-based financial services provider is launching a credit card offering aimed at delivering fair, flexible, and user-friendly financial products to consumers. The organisation is committed to empowering individuals by enhancing their understanding and control More ❯
Business Intelligence Engineer II, Amazon Sub Same Day The role of the Sub Same Day business is to provide ultrafast speeds (2-hour and same-day scheduled) and reliable delivery for the selection that customers need fast. Customers find their daily essentials and a curated selection of Amazon's top-selling items with sub-same-day promises. The program is … and experienced Business Analyst. It is a pivotal role that will contribute to the evolution and success of one of the fastest-growing businesses in the company. Joining the Amazon team means partnering with a dynamic and creative group who set a high bar for innovation and success in a fast-paced and changing environment. The Business Analyst is … metrics. - Monitor key metrics and escalate anomalies as needed - Provide input on suggested business actions based on analytical findings. BASIC QUALIFICATIONS - 5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in statistical analysis packages such More ❯
Business Intelligence Engineer, Global Selling Central BI Job ID: Amazon (Shanghai) International Trading Company Limited Amazon Global Selling has been helping individuals and businesses increase sales and reach new customers around the globe. Today, more than 50% of Amazon's total unit sales come from third-party selection. The Global Selling team in China is responsible for … recruiting local businesses to sell on Amazon's 19+ overseas marketplaces, and supporting local Sellers' success and growth on Amazon. Our vision is to be the first choice for all types of Chinese businesses going global. And the Global Selling Central BI team is looking for a Business Intelligence Engineer to collaborate with cross-functional teams to … business metrics • Design and implement AI/LLM-powered solutions to automate routine tasks and enhance decision-making processes BASIC QUALIFICATIONS - 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in statistical analysis packages such More ❯
Amazon Fulfillment Technologies & Robotics - Central Support Team is currently looking for a Database Engineer for its Hyderabad, India office to design, develop, and manage persistence solutions that serve and support FTR needs. The database engineer will be part of the worldwide operations team and responsible for designing, managing, and maintaining highly complex, confidential, mission-critical, and high-availability … an entrepreneurial start-up feel. This role offers the opportunity to operate and engineer systems at a massive scale and gain experience in DB storage technologies. About the team Amazon Fulfillment Technologies & Robotics (FTR) powers Amazon's global fulfillment network by inventing and delivering software, hardware, and data science solutions that coordinate processes, robots, machines, and people. We … integrate the physical and virtual worlds to ensure Amazon customers receive their orders promptly. The Platform Engineering Database Engineering Team is responsible for engineering, architecture, operations, support, and scaling of databases for Amazon Fulfillment Technologies (AFT), SCOT (Supply Chain Optimization Technologies), WWCR (Worldwide Customer Returns), Gift Cards, and Amazon Payments systems. BASIC QUALIFICATIONS Bachelor's degree in More ❯
Senior Delivery Consultant: Data Analytics & GenAI Job ID: Amazon Web Services Australia Pty Ltd Are you a Senior Data Analytics and GenAI consulting specialist? Do you have real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake, Data Engineering and GenAI experience? Do you have senior stakeholder engagement experience to support pre-sales and deliver consulting … Vetting Agency clearance (see ). Key job responsibilities Expertise: Collaborate with pre-sales and delivery teams to help partners and customers learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation … Amazon DataZone, Amazon SageMaker, Amazon QuickSight and Amazon Bedrock. Solutions: Support pre-sales and deliver technical engagements with partners and customers. This includes participating in pre-sales visits, understanding customer requirements, creating consulting proposals and creating packaged data analytics service offerings. Delivery: Engagements include projects proving the use of AWS services to support new distributed computing More ❯
you looking for real-world Supply Chain challenges? Do you have a desire to make a major contribution to the future, in the rapid growth environment of Cloud Computing? Amazon Web Services is looking for a highly motivated, analytical and detail-oriented candidate to help build scalable, predictive and prescriptive business analytics solutions that support AWS Supply Chain and … issues as they arise, and has a track record of using data to influence decision makers. Key job responsibilities In this role, you will: Understand a broad range of Amazon's data resources and processes. Manipulate/mine data from database tables using SQL, and from log files by writing scripts (e.g. Perl, Python). Interface with Global Stakeholders … hard-working and meticulous, who gets things done at an effective pace, gets results, and is comfortable working with tight deadlines and changing priorities. About the team Diverse Experiences Amazon values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career More ❯
proficiency across the data lifecycle Experience with database back-up, recovery, and archiving strategy Proficient knowledge of linear algebra, statistics, and geometrical algorithms Knowledge of data warehousing solutions like Amazon Redshift, Snowflake or Databricks. Preferred Qualifications Understanding of machine learning concepts and tools is a plus. About Us J.P. Morgan is a global leader in financial services, providing More ❯
required: Strong data visualisation using Power BI and coding ability (e.g. normalisation, SQL, or Python). Desirable: Experience working in data warehouse or lake environments e.g. Snowflake, Redshift, Databricks, and with ELT and data pipelines e.g. dbt Familiar with predictive analytics techniques Please apply if this sounds like you More ❯
of Financial Services (FS) Deep expertise in the full data engineering lifecycle—from data ingestion through to end-user consumption Practical experience with modern data tools and platforms, including Redshift, Airflow, Python, DBT, MongoDB, AWS, Looker, and Docker Strong grasp of best practices in data modelling, transformation, and orchestration Proven ability to build and support both internal analytics solutions More ❯
with GDPR and industry regulations. What We’re Looking For 4+ years of experience in Python development (Flask, Django). Strong expertise in AWS cloud environments (EC2, S3, RDS, Redshift, Athena). Experience with SQL databases (PostgreSQL, SQLAlchemy). Knowledge of ETL pipelines, API development, and front-end technologies (HTML, CSS). Experience with Infrastructure as Code tools (Terraform More ❯
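The role above asks for ETL pipeline experience against SQL databases. A minimal extract-transform-load sketch, using only the standard library (the CSV feed and table name are invented for illustration, and an in-memory SQLite database stands in for the PostgreSQL/Redshift targets the listing names):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; a real pipeline would pull this from S3, an API, etc.
RAW_CSV = """customer_id,amount
1,10.50
2,abc
1,4.25
"""

def extract(text):
    """Extract: parse CSV rows from the raw feed into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and drop rows with unparseable amounts."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["customer_id"]), float(row["amount"])))
        except ValueError:
            continue  # skip malformed records such as amount="abc"
    return clean

def load(rows, conn):
    """Load: write into a relational table (SQLite stands in for the warehouse)."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (customer_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 14.75 (the malformed row is dropped)
```

Keeping extract, transform, and load as separate functions makes each stage independently testable, which is the usual motivation for structuring pipelines this way.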
Employment Type: Permanent
Salary: £55000 - £75000/annum + Hybrid Working & Benefits
working across large-scale, complex environments. Proven experience in Teradata cloud transformation, particularly within Google Cloud Platform (GCP). Familiarity with other data warehouse technologies such as BigQuery, Snowflake, Redshift, or Azure Synapse is highly beneficial. Contract Details: Duration: 6 months Location: London 2x Per Week/Remote Daily Rate: Up to £475 Per Day (Inside IR35) Senior Teradata More ❯
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
HyperFinity
a business intelligence development or related role Excellent SQL skills Good understanding of relational and analytical database architectures and methodologies Experience of cloud data warehousing solutions e.g. Snowflake, BigQuery, Redshift Good hands-on experience with at least one Business Intelligence platform Strong analytical and problem-solving skills Excellent communication skills, with the ability to explain technical details to non More ❯
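The listing above leads with SQL skills against analytical databases. A tiny sketch of the kind of warehouse-style aggregation involved (table and data are invented; an in-memory SQLite database stands in for Snowflake/BigQuery/Redshift):

```python
import sqlite3

# Illustrative schema and rows only; not from any real system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 100.0), ("EMEA", 250.0), ("APAC", 80.0)],
)

# A typical analytical query: total revenue per region, highest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM sales GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('EMEA', 350.0), ('APAC', 80.0)]
```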
and Databricks, and you'll have the chance to drive impactful data initiatives from day one. Key Responsibilities: Build, maintain, and scale reliable data pipelines Work across AWS services (Redshift, Glue, Lambda) and Databricks Design and evolve data models, data lakes, and warehouse solutions Apply CI/CD practices using GitHub and modern DevOps workflows Engage with cross-functional More ❯
retraining triggers. Collaboration - work with Data Engineering, Product and Ops teams to translate business constraints into mathematical formulations. Tech stack Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow) SQL (Redshift, Snowflake or similar) AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF Optional extras: Spark, Databricks, Kubernetes. What you'll bring 3-5+ years building More ❯
update an Airflow (or Azure Data Factory) job. Review: inspect dashboards, compare control vs. treatment, plan next experiment. Tech stack Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow) SQL (Redshift, Snowflake or similar) AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF Optional extras: Spark, Databricks, Kubernetes. What you'll bring 3-5+ years building optimisation More ❯
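The review loop above compares control vs. treatment groups from an experiment. As a minimal sketch of that comparison (the readings are invented; a real analysis would also attach a significance test before acting on the difference):

```python
from statistics import mean

# Hypothetical per-user conversion values for illustration only.
control = [1.0, 0.0, 1.0, 0.0, 1.0]
treatment = [1.0, 1.0, 0.0, 1.0, 1.0]

def lift(control_vals, treatment_vals):
    """Absolute lift: difference in group means (treatment minus control)."""
    return mean(treatment_vals) - mean(control_vals)

print(round(lift(control, treatment), 2))  # 0.2
```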
in the team and contribute to deep technical discussions Nice to Have Experience with operating machine learning models (e.g., MLflow) Experience with Data Lakes, Lakehouses, and Warehouses (e.g., Delta Lake, Redshift) DevOps skills, including Terraform and general CI/CD experience Previously worked in agile environments Experience with expert systems Perks & Benefits Comprehensive benefits package Fitness reimbursement Veeva Work-Anywhere More ❯
Functions. Strong knowledge of scripting languages (e.g., Python, Bash, PowerShell) for automation and data transformation. Proficient in working with databases, data warehouses, and data lakes (e.g., SQL, NoSQL, Hadoop, Redshift). Familiarity with APIs and web services for integrating external systems and applications into orchestration workflows. Hands-on experience with data transformation and ETL (Extract, Transform, Load) processes. Strong More ❯