… Avro, etc. Deep understanding of distributed computing and parallel processing. Experience working on cloud platforms (preferably AWS) and services like S3, EMR, Glue, Redshift, BigQuery. Proficiency in CI/CD pipelines, Docker, Kubernetes, and Terraform. Knowledge of Java, Python, or other JVM-based languages is a …
Attention to detail with strong analytical and problem-solving skills. Excellent communication and stakeholder engagement skills. Experience with cloud data platforms (e.g., Azure Synapse, AWS Redshift, Google BigQuery). Knowledge of data governance frameworks and data quality management. Competence in data modelling and database design techniques. Experience working in agile …
… platforms and cloud data solutions. Hands-on experience designing data solutions on Azure (e.g., Azure Data Lake, Synapse, Data Factory) and AWS (e.g., Redshift, Glue, S3, Lambda). Strong background in data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with …
… optimisation of these. Their ideal candidate would have 10+ years' experience in Data Engineering/Architecture and good knowledge of: Data Warehousing (Snowflake, Redshift, BigQuery); ETL (Data Fabric, Data Mesh); DevOps (IaC, CI/CD, Containers); Leadership/Line Management; Consulting/Client-Facing Experience. In return they …
… building high-throughput backend systems. Experience with BI/reporting engines or OLAP stores. Deep Ruby/Rails & ActiveRecord expertise. Exposure to ClickHouse/Redshift/BigQuery. Event-driven or stream processing (Kafka, Kinesis). Familiarity with data-viz pipelines (we use Highcharts.js). AWS production experience (EC2, RDS, IAM, VPC …
… join their team and assist with the continued scaling and optimisation of these. Their ideal candidate would have good knowledge of: Data Warehousing (Snowflake, Redshift, BigQuery); ETL (Data Fabric, Data Mesh); DevOps (IaC, CI/CD, Containers); Consulting/Client-Facing Experience. In return they would be offering Uncapped …
… per day inside IR35 - Umbrella only. Job Description: 7+ years of experience designing, building, and maintaining data pipelines and architectures on the Amazon Web Services (AWS) cloud platform. Skilled in scalable, reliable, and efficient data solutions, often using AWS services like S3, Redshift, EMR, Glue, and Kinesis. … and collaborating with other teams on data analysis and business requirements. Key Skills: AWS Services: Strong understanding of and experience with AWS services like S3, Redshift, EMR, Glue, Kinesis, and Lambda. Programming Languages: Proficiency in programming languages like Python or Java, used for designing and building data pipelines. SQL: Knowledge …
… primary, secondary, and tertiary sources. Cloud, Infrastructure, and Platform Engineering: Develop and deploy data workflows on AWS or GCP, using services such as S3, Redshift, Pub/Sub, or BigQuery. Containerize data processing tasks using Docker, orchestrate with Kubernetes, and ensure production-grade deployment. Collaborate with platform teams to … ensure scalability, resilience, and observability of data pipelines. Database Engineering: Write and optimize complex SQL queries on relational (Redshift, PostgreSQL) and NoSQL (MongoDB) databases. Work with the ELK stack (Elasticsearch, Logstash, Kibana) for search, logging, and real-time analytics. Support Lakehouse architectures and hybrid data storage models for unified access …
… platforms) and specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience and ability … Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Designing Databricks-based solutions for Azure/…
At Travelex we are developing modern data technology and data products. Data is central to the way we define and sell our foreign currency exchange products. Our relationship with our customers …