Remote Apache Iceberg Jobs

15 of 15 Remote Apache Iceberg Jobs

Senior Software Engineer - Data Platform Team

Dublin, Ireland
Hybrid / WFH Options
Toast Tab, Inc
… role. Very strong knowledge of an application programming language, such as Java, Python, or JavaScript. Strong experience working within the data analysis ecosystem on top of platforms such as Spark, Iceberg, Databricks, or Snowflake. Strong knowledge of microservice-based architecture operating at scale, preferably in a cloud environment such as AWS. Experience with Java, Groovy, Kotlin, or another JVM-based language … is a plus. Experience with Apache Iceberg, Apache Druid, or Apache Flink is a plus. This is a hybrid role requiring two days per week in our Dublin office. Our Spread of Total Rewards: we strive to provide competitive compensation and benefits programs that help to attract, retain, and motivate the best and brightest people in our …
Employment Type: Permanent
Salary: EUR Annual
Posted:
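
The Toast listing above pairs Spark with Iceberg. As a rough illustration of that combination, here is a minimal PySpark sketch that writes a DataFrame to an Iceberg table; the catalog name, warehouse path, and table name are illustrative assumptions, and the Iceberg Spark runtime JAR is assumed to be available to the session.

```python
# Minimal sketch: writing a DataFrame to an Apache Iceberg table with PySpark.
# The "demo" catalog, warehouse path, and table name are illustrative assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-write-sketch")
    # The Iceberg Spark runtime JAR must be on the classpath for this to work.
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

orders = spark.createDataFrame(
    [(1, "EUR", 42.50), (2, "USD", 17.99)],
    ["order_id", "currency", "amount"],
)

# createOrReplace() builds the Iceberg table from the DataFrame schema;
# subsequent loads can use append() instead.
orders.writeTo("demo.sales.orders").createOrReplace()

spark.table("demo.sales.orders").show()
```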

Data Architect (Trading) (London)

London, UK
Hybrid / WFH Options
Keyrock
Expertise in data warehousing, data modelling, and data integration. Experience in MLOps and machine learning pipelines. Proficiency in SQL and data manipulation languages. Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and ClickHouse) and cloud-based infrastructure on AWS. Education & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related …
Employment Type: Full-time
Posted:
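
The Keyrock listing names Apache Arrow alongside Spark, Iceberg, and ClickHouse. The sketch below shows the kind of in-memory columnar work Arrow is typically used for; the column names, values, and output file name are illustrative assumptions rather than anything from the role itself.

```python
# Minimal sketch of in-memory columnar processing with Apache Arrow.
import pyarrow as pa
import pyarrow.compute as pc
import pyarrow.parquet as pq

# Build a small Arrow table in memory (in practice this would come from a feed or a lake).
trades = pa.table({
    "symbol": ["BTC-EUR", "ETH-EUR", "BTC-EUR"],
    "price": [61000.0, 3400.5, 61250.0],
    "qty": [0.5, 2.0, 0.25],
})

# Columnar compute without leaving Arrow: notional value per trade, then a filter.
notional = pc.multiply(trades["price"], trades["qty"])
trades = trades.append_column("notional", notional)
large = trades.filter(pc.greater(trades["notional"], 10000))

# Persist as Parquet, the storage format underneath most Iceberg tables.
pq.write_table(large, "large_trades.parquet")
print(large)
```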

Head of Data Engineering

London, United Kingdom
Hybrid / WFH Options
Zego
Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgresDB, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing …
Employment Type: Permanent
Salary: GBP Annual
Posted:
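
Airflow and dbt orchestration feature prominently in the Zego stack above. Below is a minimal Airflow TaskFlow DAG sketch, assuming Airflow 2.4+ and Python 3.9+; the DAG name, schedule, and task bodies are illustrative placeholders, not the company's actual pipelines.

```python
# Minimal sketch of an Airflow TaskFlow DAG (assumes Airflow 2.4+).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["sketch"])
def daily_ingest():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling records from an upstream source (API, S3, etc.).
        return [{"policy_id": 1, "premium": 120.0}]

    @task
    def load(rows: list[dict]) -> int:
        # Placeholder for writing to the warehouse or an Iceberg table.
        print(f"loading {len(rows)} rows")
        return len(rows)

    load(extract())


daily_ingest()
```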

Head of Data Engineering (London)

London, UK
Hybrid / WFH Options
Zego
Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgresDB, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing …
Employment Type: Full-time
Posted:

Senior Software Engineer, Python

London, United Kingdom
Hybrid / WFH Options
YouGov
… MySQL. Exposure to Docker, Kubernetes, AWS, Helm, Terraform, Vault, Grafana, the ELK Stack, and New Relic. Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, and BigQuery. Master's degree in Computer Science or an Engineering-related field. Get to know us better: YouGov is a global online research …
Employment Type: Permanent
Salary: GBP Annual
Posted:
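
The YouGov listing combines Iceberg with Trino/Presto. As a rough sketch of that pairing, the snippet below queries an Iceberg table through Trino's Python DB-API client; the host, catalog, schema, table, and column names are illustrative assumptions, and the `trino` package must be installed.

```python
# Minimal sketch: querying an Iceberg table through Trino from Python.
import trino

conn = trino.dbapi.connect(
    host="trino.internal.example.com",  # assumed cluster endpoint
    port=8080,
    user="analytics",
    catalog="iceberg",
    schema="surveys",
)

cur = conn.cursor()
# Standard DB-API usage: Trino plans the query against Iceberg metadata and
# prunes data files using partition and column statistics.
cur.execute(
    "SELECT panel_id, count(*) AS responses "
    "FROM responses WHERE response_date >= DATE '2024-01-01' "
    "GROUP BY panel_id ORDER BY responses DESC LIMIT 10"
)
for panel_id, responses in cur.fetchall():
    print(panel_id, responses)
```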

Senior Software Engineer, Python (London)

London, UK
Hybrid / WFH Options
YouGov
… MySQL. Exposure to Docker, Kubernetes, AWS, Helm, Terraform, Vault, Grafana, the ELK Stack, and New Relic. Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, and BigQuery. Master's degree in Computer Science or an Engineering-related field. Get to know us better: YouGov is a global …
Employment Type: Full-time
Posted:

Senior Data Platform Engineer

United Kingdom
Hybrid / WFH Options
Etsy
… S3, RDS, EMR, ECS, and more. Advanced experience working with and understanding the tradeoffs of at least one of the following data lake table/file formats: Delta Lake, Parquet, Iceberg, Hudi. Previous hands-on expertise with Spark. Experience working with containerisation technologies (Docker, Kubernetes). Streaming knowledge: experience with Kafka/Flink or other streaming ecosystems, with a solid …
Employment Type: Permanent
Salary: GBP Annual
Posted:
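
The Etsy listing asks for streaming experience (Kafka/Flink) alongside table formats such as Iceberg. The sketch below shows one common way those pieces meet: a Spark Structured Streaming job reading from Kafka and appending to an Iceberg table. Broker address, topic, catalog, bucket, and table names are illustrative assumptions, the target Iceberg table is assumed to already exist, and the Kafka and Iceberg connector JARs must be available to the session.

```python
# Minimal sketch of a Kafka -> Iceberg streaming append with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-to-iceberg-sketch")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
    .option("subscribe", "listing-events")               # assumed topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream.format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/listing-events")
    .toTable("lake.events.listing_events")  # target table assumed to exist
)
query.awaitTermination()
```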

Data Engineer (AWS) with Security Clearance

Columbia, South Carolina, United States
Hybrid / WFH Options
Systemtec Inc
… technologies and cloud-based technologies: AWS services, State Machines, CDK, Glue, TypeScript, CloudWatch, Lambda, CloudFormation, S3, Glacier archival storage, DataSync, Lake Formation, AppFlow, RDS PostgreSQL, Aurora, Athena, Amazon MSK, Apache Iceberg, Spark, Python. ONSITE: partially onsite three days per week (Tue, Wed, Thu) and as needed. Standard work hours: 8:30 AM - 5:00 PM. Required Qualifications of …
Employment Type: Permanent
Salary: USD Annual
Posted:
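
The Systemtec listing leans on AWS analytics services (Glue, Athena, Lake Formation) around Iceberg. As an illustration, the sketch below runs an Athena query against a Glue-catalogued Iceberg table from Python with boto3; the region, database, table, and S3 output location are illustrative assumptions, and credentials come from the normal AWS configuration chain.

```python
# Minimal sketch: running an Athena query against an Iceberg table via boto3.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

execution = athena.start_query_execution(
    QueryString="SELECT claim_id, status FROM claims WHERE status = 'OPEN' LIMIT 100",
    QueryExecutionContext={"Database": "analytics"},               # assumed Glue database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes; Athena executes asynchronously.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"][1:]:  # first row is the column header
        print([field.get("VarCharValue") for field in row["Data"]])
```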

Software Engineer (Data & Infrastructure)

London, United Kingdom
Hybrid / WFH Options
Modo Energy Limited
… pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack, including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
Employment Type: Permanent
Salary: GBP Annual
Posted:
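
The Modo Energy stack above puts Python directly next to Iceberg. For roles like this, table access does not have to go through Spark at all; the sketch below reads an Iceberg table with the PyIceberg client. The catalog URI, warehouse, table name, and columns are illustrative assumptions, and the `pyiceberg` package is required.

```python
# Minimal sketch: reading an Iceberg table from plain Python with PyIceberg.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "default",
    **{
        "uri": "http://rest-catalog.internal.example.com:8181",  # assumed REST catalog
        "warehouse": "s3://example-bucket/warehouse",            # assumed warehouse
    },
)

table = catalog.load_table("energy.asset_readings")  # assumed table

# Scan with a pushed-down filter and a column projection, materialised as Arrow.
arrow_table = table.scan(
    row_filter="reading_date >= '2024-01-01'",
    selected_fields=("asset_id", "reading_date", "mw_output"),
).to_arrow()

print(arrow_table.num_rows)
```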

Software Engineer - Frontline

London, United Kingdom
Hybrid / WFH Options
Palantir Technologies
… solve any given problem. Technologies We Use: a variety of languages, including Java, Python, Rust, and Go for backend and TypeScript for frontend; open-source technologies like Cassandra, Spark, Iceberg, Elasticsearch, Kubernetes, React, and Redux; industry-standard build tooling, including Gradle for Java, Cargo for Rust, Hatch for Python, and Webpack & PNPM for TypeScript. What We Value: a strong engineering background …
Employment Type: Permanent
Salary: GBP Annual
Posted:
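
The Palantir listing puts Spark and Iceberg in the same open-source toolbox. One Iceberg feature worth knowing in that context is hidden partitioning, sketched below via Spark SQL DDL; the catalog, namespace, table, and column names are illustrative assumptions, and the Iceberg runtime and SQL extensions are assumed to be configured.

```python
# Minimal sketch: Iceberg DDL with hidden partitioning through Spark SQL.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-ddl-sketch")
    .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.ops.audit_events (
        event_id   BIGINT,
        user_id    BIGINT,
        event_type STRING,
        event_ts   TIMESTAMP
    )
    USING iceberg
    PARTITIONED BY (days(event_ts), event_type)
""")

-- = comment =
# Writers and readers never reference the partition columns directly:
# Iceberg derives days(event_ts) from event_ts at write and query time.
spark.sql("SHOW CREATE TABLE lake.ops.audit_events").show(truncate=False)
```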

Graduate Software Engineer (Data & Infrastructure)

London, United Kingdom
Hybrid / WFH Options
Modo Energy Limited
… pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack, including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Architect

South East, United Kingdom
Hybrid / WFH Options
Anson Mccade
… DBT, and data governance frameworks. Preferred: certifications in cloud/data technologies; experience with API/interface modelling and CI/CD (e.g. GitHub Actions); knowledge of Atlan and Iceberg tables. Reference: AMC/SCU/SDA/3007. Postcode: SW1.
Employment Type: Permanent
Posted:
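
"Knowledge of Iceberg tables" in architect roles like the one above usually includes the table's snapshot model. The sketch below inspects snapshots and runs a time-travel query with Spark SQL; it assumes a Spark 3.3+ session already configured with an Iceberg catalog named "lake" (as in the earlier sketches), and the table name and timestamp are illustrative assumptions.

```python
# Minimal sketch: Iceberg snapshot metadata and time travel via Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-time-travel-sketch").getOrCreate()

# Every commit to an Iceberg table produces a snapshot recorded in table metadata.
spark.sql(
    "SELECT snapshot_id, committed_at, operation FROM lake.ops.audit_events.snapshots"
).show()

# Query the table as it existed at a given point in time.
spark.sql("""
    SELECT count(*) AS events_then
    FROM lake.ops.audit_events
    TIMESTAMP AS OF '2024-06-01 00:00:00'
""").show()
```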

Data Architect

London Area, United Kingdom
Hybrid / WFH Options
Anson McCade
… PowerDesigner. Strong SQL and Python skills (Snowflake or similar). AWS experience (Lambda, SNS, S3, EKS, API Gateway). Familiarity with data governance (GDPR, HIPAA). Bonus points for: DBT, Airflow, Atlan, Iceberg, CI/CD, API modelling. The vibe: you’ll be joining a collaborative, inclusive team that values technical excellence and continuous learning. Flexible working, strong L&D support, and …
Posted:
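
The AWS services named above (Lambda, SNS, S3) often appear together in event-driven ingestion. The sketch below is a Lambda-style handler that lands a payload in S3 and publishes a notification to SNS; the bucket, topic ARN, and event shape are illustrative assumptions.

```python
# Minimal sketch: a Lambda-style handler writing to S3 and notifying via SNS.
import json

import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

BUCKET = "example-landing-bucket"                                        # assumed bucket
TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:example-ingest-events"  # assumed topic


def handler(event, context):
    # Persist the incoming payload, then emit a notification for downstream consumers.
    # The "record_id" field is an assumed part of the event shape.
    key = f"landing/{event['record_id']}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event).encode("utf-8"))
    sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps({"bucket": BUCKET, "key": key}))
    return {"statusCode": 200, "body": key}
```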

Data Architect

City of London, London, United Kingdom
Hybrid / WFH Options
Anson McCade
… PowerDesigner. Strong SQL and Python skills (Snowflake or similar). AWS experience (Lambda, SNS, S3, EKS, API Gateway). Familiarity with data governance (GDPR, HIPAA). Bonus points for: DBT, Airflow, Atlan, Iceberg, CI/CD, API modelling. The vibe: you’ll be joining a collaborative, inclusive team that values technical excellence and continuous learning. Flexible working, strong L&D support, and …
Posted:

Data Architect

South East, United Kingdom
Hybrid / WFH Options
Anson Mccade
… of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan). Strong communication, stakeholder management, and documentation skills. Preferred (but not essential): AWS or Snowflake certifications; knowledge of Apache Airflow, DBT, and GitHub Actions; experience with Iceberg tables and data product thinking. Why apply? Work on high-impact, high-scale client projects. Join a technically elite team with …
Employment Type: Permanent, Work From Home
Posted:
Apache Iceberg - Work from Home Salary Percentiles
10th Percentile: £100,000
25th Percentile: £105,000
Median: £130,000
75th Percentile: £137,500