design, implementation, testing, and support of next-generation features related to Dremio's Query Planner and Reflections technologies. Work with open source projects like Apache Calcite and Apache Iceberg. Use modular design patterns to deliver an architecture that's elegant, simple, extensible and maintainable. Solve complex technical … distributed query engines. Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, storage systems, heap management, Apache Arrow, SQL operators, caching techniques, and disk spilling. Hands-on experience with multi-threaded and asynchronous programming models.
grasp of data governance/data management concepts, including metadata management, master data management and data quality. Ideally, have experience with the Data Lakehouse toolset (Iceberg). What you'll get in return: hybrid working (4 days per month in London HQ + as and when required), access to market leading
Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; DBT; AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake; Elasticsearch and DynamoDB. More information: enjoy fantastic perks like private healthcare & dental insurance, a generous work
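As a rough, hedged illustration of how a lake like the one above (S3, Iceberg, Parquet, Glue, Athena) is typically queried from Python, the sketch below submits an Athena query with boto3. The region, database, table, and results-bucket names are hypothetical placeholders, not details from the listing.

    import boto3

    # Athena client; the region is a placeholder.
    athena = boto3.client("athena", region_name="eu-west-2")

    # Submit a SQL query against a Glue-catalogued table backed by S3/Iceberg/Parquet.
    # "lake" and "trades" are hypothetical names; results land in the given S3 bucket.
    response = athena.start_query_execution(
        QueryString="SELECT symbol, count(*) AS n FROM trades GROUP BY symbol",
        QueryExecutionContext={"Database": "lake"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    print(response["QueryExecutionId"])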
Data Engineer Language Skills (Python, SQL, JavaScript). Experience of Azure, AWS or GCP cloud platforms and Data Lake/Warehousing platforms such as Snowflake, Iceberg etc. Experience of various ETL and streaming tools (Flink, Spark). Experience of a variety of data mining techniques (APIs, GraphQL, website scraping). Ability to
working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace: We are proud to be an equal opportunity workplace. We do not discriminate based
hands-on experience with exposure to Snowflake and the Big Data tech stack. Technical Skills: Python, PySpark, DBT, Airflow, AWS (especially S3 and Glue), Apache Iceberg, SQL, Git, Kubernetes and Docker, Shell, Snowflake and the Big Data tech stack.
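A minimal sketch of how the Python/PySpark/S3/Glue/Iceberg pieces of a stack like this fit together, assuming the Iceberg Spark runtime and AWS bundle jars are on the classpath and that the target table already exists; all bucket, catalog, and table names below are hypothetical.

    from pyspark.sql import SparkSession

    # Spark session with an Iceberg catalog backed by the AWS Glue Data Catalog.
    spark = (
        SparkSession.builder
        .appName("iceberg-glue-sketch")
        .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.lake.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
        .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse/")
        .getOrCreate()
    )

    df = spark.createDataFrame(
        [(1, "GBPUSD", 1.27), (2, "EURUSD", 1.08)],
        ["trade_id", "symbol", "price"],
    )

    # Append into an existing Iceberg table; Iceberg manages snapshots and schema metadata.
    df.writeTo("lake.analytics.trades").append()

Because the table lives in the Glue Data Catalog, the same data is then visible to other engines on the account (Athena, EMR and so on) without copying.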
data. What you offer: Experience with AWS cloud. Experience programming, debugging, and running production systems in Python. Exposure to open-source technologies such as Iceberg, Trino, and Airflow. Passionate about the use and adoption of these capabilities, focused on user experience and ensuring our business sees real value from
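For context on the Trino/Iceberg pairing mentioned above, a minimal sketch using the trino Python client is shown below; the coordinator host, user, schema, and table names are assumptions for illustration only.

    import trino

    # Connect to a Trino coordinator that exposes an Iceberg catalog (names are placeholders).
    conn = trino.dbapi.connect(
        host="trino.example.internal",
        port=8080,
        user="analyst",
        catalog="iceberg",
        schema="analytics",
    )

    cur = conn.cursor()
    cur.execute("SELECT symbol, avg(price) FROM trades GROUP BY symbol")
    for row in cur.fetchall():
        print(row)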
to-end engineering experience supported by excellent tooling and automation. Preferred Qualifications, Capabilities, and Skills: Good understanding of the Big Data stack (Spark/Iceberg). Ability to learn new technologies and patterns on the job and apply them effectively. Good understanding of established patterns, such as stability patterns
processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the
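A minimal sketch of the kind of Airflow pipeline described above, assuming a recent Airflow 2 release; the DAG id, schedule, and task bodies are hypothetical stand-ins for the real ingestion and maintenance logic (for example Kafka consumption and Iceberg table upkeep).

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_events() -> None:
        # Placeholder: in practice this would consume from Kafka and stage records
        # before they are written to an Iceberg table.
        print("ingesting events")


    def maintain_table() -> None:
        # Placeholder: e.g. trigger Iceberg maintenance such as compaction or snapshot expiry.
        print("maintaining table")


    with DAG(
        dag_id="events_to_iceberg",
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest_events", python_callable=ingest_events)
        maintain = PythonOperator(task_id="maintain_table", python_callable=maintain_table)
        ingest >> maintain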
analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, and Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes of structured and
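To make the streaming half of such a pipeline concrete, here is a hedged sketch of a Spark Structured Streaming job reading from Kafka and landing Parquet on S3; it assumes the spark-sql-kafka package is available, and the broker, topic, bucket, and checkpoint paths are placeholders.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-to-s3-sketch").getOrCreate()

    # Read a Kafka topic as an unbounded stream (broker and topic are placeholders).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "events")
        .load()
    )

    # Persist the raw payloads as Parquet on S3; the checkpoint location lets the
    # sink recover after restarts without duplicating output files.
    query = (
        events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
        .writeStream.format("parquet")
        .option("path", "s3://example-bucket/raw/events/")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
        .start()
    )
    query.awaitTermination()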
Salary range: £70,000-£80,000 + 10% bonus + benefits. Purpose: Build and maintain large, scalable Data Lakes, processes and pipelines. Tech: Python, Iceberg/Kafka, Spark/Glue, CI/CD. Industry: Financial services/securities trading. Immersum continue to support a leading SaaS securities trading platform … Infra tooling using Terraform, Ansible and Jenkins whilst automating everything with Python. Tech (experience in any listed is advantageous): Python; Cloud: AWS; Lakehouse: Apache Spark or AWS Glue; Cloud-native storage: Iceberg, RDS, Redshift, Kafka; IaC: Terraform, Ansible; CI/CD: Jenkins, GitLab; other platforms such as
Milton Keynes, Buckinghamshire, United Kingdom Hybrid / WFH Options
Banco Santander SA
AWS Data Engineer S3 Data Centre of Excellence Country: United Kingdom Interested in part-time, job-share or flexible working? We want to talk to you! Join our community. We have an