across complex data and cloud engineering projects Designing and delivering distributed solutions on an AWS-centric stack, with open-source flexibility Working with Databricks, Apache Iceberg, and Kubernetes in a cloud-agnostic environment Guiding architecture and implementation of large-scale data pipelines for structured and unstructured data Steering … Proven delivery of big data solutions—not necessarily at FAANG scale, but managing high-volume, complex data (structured/unstructured) Experience working with Databricks, Apache Iceberg, or similar modern data platforms Experience building software environments from the ground up, setting best practices and standards Experience leading and … startup/scaleup background and someone who is adaptable Tech Stack Snapshot Languages: Python Cloud: AWS preferred, cloud-agnostic approach encouraged Data: SQL, Databricks, Iceberg, Kubernetes, large-scale data pipelines CI/CD & Ops: Open-source tools, modern DevOps principles Why Join Us? Impactful Work – Help solve security problems …
London (City of London), South East England, United Kingdom
developrec
not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income JavaScript Node Fixed Income … times a week. The tech environment is very new and will likely soon include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours …
implementing REST APIs Familiarity with AWS Preferred experience/skills An understanding of the asset management business and/or financial markets Experience with Iceberg or Snowflake is a bonus Experience with gRPC Experience working in a Scrum team Commitment to Diversity, Equity, and Inclusion: At T. Rowe Price …
Hive, Redshift, Kafka, etc. Experience in data quality testing; capable of writing test cases and scripts, and resolving data issues. Experience with Databricks, Snowflake, and Iceberg is required. Preferred qualifications, capabilities, and skills Understanding of application and data design disciplines, especially real-time processing with Kafka. Knowledge of the Commercial …
stakeholders at all levels, provide training, and solicit feedback. Preferred qualifications, capabilities, and skills Experience with big-data technologies, such as Splunk, Trino, and Apache Iceberg. Data Science experience. AI/ML experience with building models. AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer).
grasp of data governance/data management concepts, including metadata management, master data management and data quality. Ideally, have experience with a Data Lakehouse toolset (Iceberg) What you'll get in return Hybrid working (4 days per month in London HQ + as and when required) Access to market-leading …
Scala Starburst and Athena Kafka and Kinesis DataHub MLflow and Airflow Docker and Terraform Kafka, Spark, Kafka Streams and KSQL DBT AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake Elasticsearch and DynamoDB More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work …
intellectually curious, and team-oriented. Strong communication skills. Experience with options trading or options data is a strong plus. Experience with technologies like KDB, Apache Iceberg, and Lake Formation will be a meaningful differentiator.
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
working in cloud environments (AWS) Strong proficiency with Python and SQL Extensive hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation) Solid understanding of data …
Data Engineer Language Skills (Python, SQL, JavaScript) Experience of Azure, AWS or GCP cloud platforms and Data Lake/Warehousing Platforms such as Snowflake, Iceberg, etc. Experience of various ETL and Streaming Tools (Flink, Spark) Experience of a variety of data mining techniques (APIs, GraphQL, Website Scraping) Ability to …
London, England, United Kingdom Hybrid / WFH Options
Workato
data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies like Open Table Format, Apache Iceberg, and their impact on enterprise data strategies. Hands-on experience with data virtualization and analytics platforms (Denodo, Domo) to enable seamless self…
and designing multi-step pipelines. About you: Proficient in Python Experience in building complex data transformation pipelines Experience with Databricks at scale, preferably with Iceberg Familiarity with Airflow or Dagster Experience with AWS and open-source technologies on top of DataWeave Desirable: Exposure to medical data, especially video/…
to-end engineering experience supported by excellent tooling and automation. Preferred Qualifications, Capabilities, and Skills: Good understanding of the Big Data stack (Spark/Iceberg). Ability to learn new technologies and patterns on the job and apply them effectively. Good understanding of established patterns, such as stability patterns …
processes using infrastructure-as-code (Terraform) Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the …
analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, and Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes of structured and …