will shape our data strategy and support business growth. Responsibilities: Collaborate with teams to understand objectives, data requirements, and analytics goals. Develop data analysis strategies using AWS services like Amazon Redshift, Athena, EMR, and QuickSight. Design and build data pipelines and ETL processes for data extraction, transformation, and loading into AWS. Apply statistical and machine learning techniques for … predictive and segmentation analyses. Identify key metrics, develop KPIs, and create dashboards using Amazon QuickSight. Conduct exploratory data analysis to identify trends and insights for product and user engagement improvements. Work with data engineers to optimize data architecture, quality, and governance in AWS. Requirements: Bachelor's or master's degree in Computer Science, Statistics, Mathematics, or a related field. Experience … as a Data Scientist, preferably with AWS analytics services. Proficiency in AWS analytics tools such as Redshift, Athena, EMR, and QuickSight. Knowledge of data modeling, ETL, data warehousing, statistical analysis, and machine learning. Programming skills in Python, R, or Scala. Experience with SQL and NoSQL databases and visualization tools. Strong analytical and problem-solving skills. Experience with social media analytics and user behavior.
Collaborate with cross-functional teams to develop data-driven solutions that support business growth. Responsibilities: Work with teams to understand data needs and analytics goals. Use AWS tools like Amazon Redshift, Athena, EMR, and QuickSight for data analysis. Create data pipelines and ETL processes for data integration. Apply statistical and machine learning techniques for analysis and modeling. Develop … AWS. Requirements: Bachelor's or master's degree in Computer Science, Statistics, Mathematics, or a related field. Experience as a Data Scientist, especially with AWS analytics. Proficiency in AWS tools such as Redshift, Athena, EMR, and QuickSight. Knowledge of data modeling, ETL, and data warehousing. Skills in statistical analysis, machine learning, and Python/R/Scala. Experience with SQL, NoSQL, and data visualization tools.
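The two data-science listings above centre on querying and analysing data with AWS services such as Athena and Redshift. As a purely illustrative, hedged sketch of that kind of work, the snippet below submits an Athena query from Python with boto3 and polls for the result; the database, table, and S3 output bucket names are hypothetical placeholders, not details from the listings.

```python
# Minimal sketch: run an ad-hoc Athena query from Python with boto3.
# Database, table, and S3 output location are hypothetical placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")

def run_athena_query(sql: str, database: str, output_s3: str) -> list[dict]:
    """Submit a query, poll until it finishes, and return the raw result rows."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    query_id = execution["QueryExecutionId"]

    # Poll for completion; production code would add a timeout and backoff.
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]

# Example usage: daily active users by day (table and bucket names are made up).
rows = run_athena_query(
    "SELECT event_date, COUNT(DISTINCT user_id) AS dau "
    "FROM analytics.events GROUP BY event_date",
    database="analytics",
    output_s3="s3://example-athena-results/queries/",
)
```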
GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, and Dataflow/Airflow/ADF. Excellent consulting experience and ability to design and build solutions … experience in a similar role. Ability to lead and mentor architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure; Big Data; Apache Spark/Beam on BigQuery/Redshift/Synapse; Pub/Sub/Kinesis/MQ/Event Hubs; Kafka; Dataflow/Airflow/ADF; designing Databricks-based solutions for Azure/AWS; Jenkins; Terraform; Stackdriver or
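The architect role above calls for Big Data processing with Apache Spark or Beam alongside warehouses such as BigQuery and Redshift. Below is a minimal, assumption-laden Apache Beam (Python SDK) sketch of a batch aggregation; the input file, field names, and output path are invented for illustration, and the same pipeline could be pointed at Dataflow by changing the runner options rather than the pipeline code.

```python
# Minimal Apache Beam batch pipeline, run locally with the default DirectRunner.
# File paths and field names are illustrative only.
import json

import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromText("events.jsonl")      # one JSON object per line
        | "ParseJson" >> beam.Map(json.loads)
        | "KeyByUser" >> beam.Map(lambda event: (event["user_id"], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)                 # events per user
        | "FormatCsv" >> beam.MapTuple(lambda user, count: f"{user},{count}")
        | "WriteCounts" >> beam.io.WriteToText("events_per_user")
    )
```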
Slough, South East England, United Kingdom Hybrid / WFH Options
Ignite Digital Talent
Strong hands-on experience with Python in a data context. Proven skills in SQL. Experience with Data Warehousing (DWH), ideally with Snowflake or similar cloud data platforms (Databricks or Redshift). Experience with DBT, Kafka, Airflow, and modern ELT/ETL frameworks. Familiarity with data visualisation tools like Sisense, Looker, or Tableau. Solid understanding of data architecture, transformation workflows, and
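The listing above combines dbt, Kafka, Airflow, and ELT frameworks. As one possible illustration of how such pieces are commonly wired together, here is a minimal Airflow DAG sketch that loads raw data and then triggers a dbt run; the DAG id, file paths, and commands are hypothetical and not taken from the listing.

```python
# Minimal Airflow DAG sketch: land raw data, then transform it with dbt.
# DAG id, paths, and commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load raw files into the warehouse (placeholder command).
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="python /opt/pipelines/load_raw.py",
    )

    # Run dbt transformations once the raw data has landed.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    extract_load >> dbt_run
```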
Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery). Experience with BI/analytics tools (e.g., Power BI, Tableau) and data visualization best practices. Strong knowledge of data governance, data privacy, and compliance frameworks (e.g., GDPR).
and manage DBT models for data transformation and modeling in a modern data stack. Proficiency in SQL, Python, and PySpark. Experience with AWS services such as S3, Athena, Redshift, Lambda, and CloudWatch. Familiarity with data warehousing concepts and modern data stack architectures. Experience with CI/CD pipelines and version control (e.g., Git). Collaborate with data analysts
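This listing names SQL, Python, PySpark, and AWS services such as S3 and Athena. The sketch below shows, under assumed bucket and column names, a typical PySpark transformation step: read raw JSON from S3, build a daily aggregate, and write partitioned Parquet that Athena could then query.

```python
# Minimal PySpark transformation sketch: raw JSON in S3 -> daily aggregate -> Parquet.
# Bucket names and columns are hypothetical; s3a access assumes the cluster is
# already configured with hadoop-aws credentials.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

raw = spark.read.json("s3a://example-raw-bucket/orders/")

daily = (
    raw.withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date", "country")
       .agg(
           F.count("*").alias("orders"),
           F.sum("amount").alias("revenue"),
       )
)

# Partitioned Parquet output can then be crawled and queried with Athena.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-curated-bucket/orders_daily/"
)
```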
NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery). Experience with BI/analytics tools (e.g., Power BI, Tableau) and data visualization best practices. Strong knowledge of data governance, data privacy, and compliance frameworks (e.g., GDPR)
as-Code (IaC) and delivering data platform projects in iterative cycles. Non-Microsoft Data Tools – Exposure to or hands-on experience with tools such as Snowflake, Databricks, or AWS Redshift. Cross-Platform Reporting Tools – Knowledge of BI tools beyond Power BI, such as Tableau or Qlik, for comparative understanding or hybrid deployments. KEY COMPETENCIES REQUIRED FOR ROLE: Achievement Focus
cross-functional teams, and develop data-driven solutions to support business growth. Responsibilities: Work with teams to understand objectives and define analytics goals. Develop strategies using AWS services like Amazon Redshift, Athena, EMR, and QuickSight. Design data pipelines and ETL processes for data extraction and transformation. Apply statistical and machine learning techniques for analysis and pattern recognition. Develop … KPIs and dashboards using Amazon QuickSight for decision-making. Perform exploratory data analysis to uncover insights. Collaborate with data engineers on data architecture and governance in AWS. Requirements: Bachelor's or master's degree in a relevant field. Experience as a Data Scientist, preferably with AWS. Proficiency in AWS analytics tools. Understanding of data modeling, ETL, and warehousing. Skills in
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
desktops. Requirements: Demonstrable experience within consulting/managed service environments. Strong experience in building, configuring, and optimising Data Lake environments. Experience with Landing Zones/Transit Gateways/Redshift/Firehose/CloudTrail/Workspaces. Experience within Linux-based environments. Strong understanding of AWS Data Lake solutions and AWS Redshift. AWS Certified Solutions Architect, Developer, or SysOps
architecture, with at least 3 years in the insurance industry. Strong understanding of London Market and specialty insurance operations and data flows. Proven experience with AWS (S3, Glue, Redshift, Lambda, etc.), Databricks, and Snowflake. Expertise in building and optimizing medallion architecture. Solid knowledge of data governance, security, and compliance frameworks. Experience with ETL/ELT tools
Our Client: A new UK-based financial services provider is launching a credit card offering aimed at delivering fair, flexible, and user-friendly financial products to consumers. The organisation is committed to empowering individuals by enhancing their understanding and control
required: Strong data visualisation using Power BI and coding ability using either normalisation, SQL, or Python. Desirable: Experience working in data warehouse or lake environments (e.g. Snowflake, Redshift, Databricks) and with ELT and data pipelines (e.g. dbt). Familiarity with predictive analytics techniques. Please apply if this sounds like you.
with the continued scaling and optimisation of these. Their ideal candidate would have 10+ years' experience in Data Engineering/Architecture and good knowledge of: Data Warehousing (Snowflake, Redshift, BigQuery), ETL (Data Fabric, Data Mesh), DevOps (IaC, CI/CD, Containers), Leadership/Line Management, and Consulting/Client-Facing Experience. In return they would be offering Free Certification
seeking a Managing Data Architect to help scale and optimize new projects. Ideal candidates will have 10+ years in Data Engineering/Architecture with expertise in: Data Warehousing (Snowflake, Redshift, BigQuery), ETL (Data Fabric, Data Mesh), DevOps (IaC, CI/CD, Containers), Leadership/Line Management, and Consulting/Client-Facing Experience. The company offers: Free Certification Scheme (ServiceNow, TOGAF
Modeling; Data Integration & Ingestion; Data Manipulation & Processing; Version Control & DevOps: skilled in GitHub, GitHub Actions, Azure DevOps; Azure Data Factory, Databricks, SQL DB, Synapse, Stream Analytics; Glue, Airflow, Kinesis, Redshift; SonarQube, PyTest. If you're ready to take on a new challenge and shape data engineering in a trading-first environment, submit your CV today to be considered.
Functions. Strong knowledge of scripting languages (e.g., Python, Bash, PowerShell) for automation and data transformation. Proficient in working with databases, data warehouses, and data lakes (e.g., SQL, NoSQL, Hadoop, Redshift). Familiarity with APIs and web services for integrating external systems and applications into orchestration workflows. Hands-on experience with data transformation and ETL (Extract, Transform, Load) processes. Strong
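The orchestration role above emphasises scripting for automation and classic ETL. As a small, self-contained illustration (file, column, and table names are made up), the following Python sketch extracts a CSV, transforms it with pandas, and loads the result into a database table; SQLite stands in for a real warehouse purely so the example runs anywhere.

```python
# Minimal ETL sketch: extract a CSV, transform with pandas, load into a table.
# File, column, and table names are hypothetical placeholders.
import sqlite3

import pandas as pd

# Extract: read the raw source data.
raw = pd.read_csv("raw_transactions.csv", parse_dates=["created_at"])

# Transform: drop incomplete rows and aggregate to one row per customer per day.
clean = (
    raw.dropna(subset=["customer_id", "amount"])
       .assign(txn_date=lambda df: df["created_at"].dt.date)
       .groupby(["customer_id", "txn_date"], as_index=False)["amount"]
       .sum()
       .rename(columns={"amount": "daily_spend"})
)

# Load: write to a warehouse table (SQLite here only so the sketch is runnable;
# in practice this step would target Redshift, Snowflake, or similar).
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("daily_customer_spend", conn, if_exists="replace", index=False)
```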
Nice-to-Have: 5+ yrs building high-throughput backend systems; experience with BI/reporting engines or OLAP stores; deep Ruby/Rails & ActiveRecord expertise; exposure to ClickHouse/Redshift/BigQuery; event-driven or stream processing (Kafka, Kinesis); familiarity with data-viz pipelines (we use Highcharts.js); AWS production experience (EC2, RDS, IAM, VPC); contributions to OSS or tech