London (City of London), South East England, United Kingdom
developrec
across complex data and cloud engineering projects Designing and delivering distributed solutions on an AWS-centric stack, with open-source flexibility Working with Databricks, Apache Iceberg, and Kubernetes in a cloud-agnostic environment Guiding architecture and implementation of large-scale data pipelines for structured and unstructured data Steering … Proven delivery of big data solutions, not necessarily at FAANG scale, but managing high-volume, complex data (structured/unstructured) Experience working with Databricks, Apache Iceberg, or similar modern data platforms Experience building software environments from the ground up, setting best practice and standards Experience leading and … startup/scaleup background and someone who is adaptable ⚙️ Tech Stack Snapshot Languages: Python Cloud: AWS preferred, cloud-agnostic approach encouraged Data: SQL, Databricks, Iceberg, Kubernetes, large-scale data pipelines CI/CD & Ops: Open source tools, modern DevOps principles 🚀 Why Join Us? Impactful Work – Help solve security problems …
grasp of data governance/data management concepts, including metadata management, master data management and data quality. Ideally, have experience with a Data Lakehouse toolset (Iceberg). What you'll get in return: Hybrid working (4 days per month in London HQ + as and when required) Access to market-leading …
building ETL pipelines using Python. Experience of SQL and relational databases. Experience with AWS or similar Cloud technology. Experience with S3, Kafka, Airflow, and Iceberg will be beneficial. Experience in the financial markets with a focus on securities & derivatives trading. Exceptional communication skills, attention to detail, and adaptability.
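The Python ETL experience asked for above can be illustrated with a minimal, dependency-free sketch of the extract/transform/load pattern. All record fields and values here are hypothetical, chosen only to echo the securities-trading context; a real pipeline would read from S3 or a Kafka topic and write to an Iceberg table rather than a Python list.

```python
# Minimal ETL sketch: extract raw trade records, transform (normalise and
# filter), and load into an output list standing in for a warehouse table.
# Fields and values are hypothetical, for illustration only.

def extract():
    # A real pipeline might read these from S3 or a Kafka topic.
    return [
        {"symbol": "aapl", "price": "189.50", "qty": "10"},
        {"symbol": "msft", "price": "bad", "qty": "5"},
    ]

def transform(records):
    out = []
    for r in records:
        try:
            out.append({
                "symbol": r["symbol"].upper(),
                "notional": float(r["price"]) * int(r["qty"]),
            })
        except ValueError:
            continue  # drop malformed rows rather than failing the batch
    return out

def load(rows, sink):
    sink.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # only the clean row survives; the malformed one is dropped
```

The per-row try/except is a deliberate design choice: in batch ETL it is usually better to quarantine or drop bad records than to let one malformed row abort the whole load.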
Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; DBT; AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake; Elasticsearch and DynamoDB. More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work …
London, South East England, United Kingdom | Hybrid / WFH Options
Radley James
cloud-native environments (AWS preferred) Strong proficiency with Python and SQL Extensive hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation) Solid understanding of data …
Data Engineer Language Skills (Python, SQL, JavaScript) Experience of Azure, AWS or GCP cloud platforms and Data Lake/Warehousing Platforms such as Snowflake, Iceberg etc Experience of various ETL and Streaming Tools (Flink, Spark) Experience of a variety of data mining techniques (APIs, GraphQL, Website Scraping) Ability to …
data. What you offer Experience with AWS cloud. Experience programming, debugging, and running production systems in Python. Exposure to open-source technologies such as Iceberg, Trino, and Airflow. Passionate about the use and adoption of these capabilities, focused on user experience and ensuring our business sees real value from …
processes using infrastructure-as-code (Terraform) Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the …
delivering customer proposals aligned with Analytics Solutions. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro, Parquet, Iceberg, Hudi). Experience developing software and data engineering code in one or more programming languages (Java, Python, PySpark, Node, etc). AWS and other …
analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes of structured and …
the future of the AI Data Cloud. Join the Snowflake team. Snowflake is seeking an accomplished Principal Sales Specialist, Data Engineering Platform (Openflow and Iceberg/Polaris) to drive sales execution for our ingestion workloads for the EMEA markets. Reporting to the Managing Director, Data Engineering Platform Sales, this …
with Spark or DBT on Starburst Use SQL to transform data into meaningful insights Build and deploy infrastructure with Terraform Implement DDL and DML with Iceberg Do code reviews for your peers Orchestrate your pipelines with DAGs on Airflow Participate in SCRUM ceremonies (standups, backlogs, demos, retros, planning) Secure data … diagram of proposed tables to enable discussion Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users Worked on Apache Airflow before to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing and sprints More information: Enjoy fantastic perks like …
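Orchestrating pipelines with DAGs, as this role describes, comes down to running tasks in dependency order. A minimal pure-Python sketch of that idea, using the standard library's `graphlib` rather than the Airflow API itself; the task names are hypothetical:

```python
# Topological ordering of a tiny pipeline DAG: the core scheduling idea
# behind Airflow. This is NOT the Airflow API, just an illustration of
# dependency-ordered execution with hypothetical task names.
from graphlib import TopologicalSorter

# Map each task to the set of upstream tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "data_quality_check": {"transform"},
    "load_iceberg": {"data_quality_check"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # upstream tasks always precede their downstream dependents
```

In Airflow proper, the same shape would be declared with operators and `>>` dependencies inside a `DAG` context, and the scheduler, not your code, would compute this ordering.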
modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and DBT. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global …