React or Angular (good but not necessary). Agile. The following is desirable, not essential: AWS or GCP; buy-side experience; data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer … in the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This environment has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are 9-5. Salary …
programming language; SQL. Experience implementing REST APIs. Familiarity with AWS. Preferred experience/skills: an understanding of the asset management business and/or financial markets; experience with Iceberg or Snowflake is a bonus; experience with gRPC; experience working in a Scrum team. Commitment to Diversity, Equity, and Inclusion: At T. Rowe Price, our associates are our greatest …
ability to effectively collaborate with stakeholders at all levels, provide training, and solicit feedback. Preferred qualifications, capabilities, and skills: experience with big-data technologies such as Splunk, Trino, and Apache Iceberg; Data Science experience; AI/ML experience building models; AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer). About Us: J.P. Morgan is a global …
data security, and data privacy frameworks and standards. Experience in data analysis and data visualization using tools such as Power BI. Excellent communication, collaboration, and problem-solving skills. Knowledge of Iceberg and Trino will be considered an advantage. Additional Information - Compensation: MUFG Investor Services provides all its employees with a competitive and attractive compensation package. Starting salary will be dependent on …
e.g. Glue and S3. Solid grasp of data governance/data management concepts, including metadata management, master data management, and data quality. Ideally, experience with a Data Lakehouse toolset (Iceberg). What you'll get in return: hybrid working (4 days per month in London HQ + as and when required); access to market-leading technologies. What you need to …
Data technologies (Spark, Impala, Hive, Redshift, Kafka, etc.). Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues. Experience with Databricks, Snowflake, and Iceberg is required. Preferred qualifications, capabilities, and skills: experience in application and data design disciplines with an emphasis on real-time processing and delivery (e.g., Kafka is preferable). Understanding of …
building ETL/ELT pipelines using Python and pandas within a financial environment. Strong knowledge of relational databases and SQL. Familiarity with various technologies such as S3, Kafka, Airflow, Iceberg. Proficiency working with large financial datasets from various vendors. A commitment to engineering excellence and pragmatic technology solutions. A desire to work in an operational role at the heart … What would be advantageous: Strong understanding of financial markets. Experience working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace: We are proud to be an equal opportunity workplace. We do not discriminate based upon race, religion, color, national …
Qualifications: Other useful skills and experience include: a proven desire to expand your cloud/platform engineering capabilities; experience working with Big Data; experience of data storage technologies (Delta Lake, Iceberg, Hudi); proven knowledge and understanding of Apache Spark, Databricks, or Hadoop; the ability to take business requirements and translate these into tech specifications; competence in evaluating and selecting development …
Senior Lead Software Engineer at LSEG (London Stock Exchange Group). Role Description: As a Senior Lead within Software Engineering, you'll design and implement functionalities, focusing on Data Engineering tasks. You'll work with semi-structured data to ingest and distribute it on a Microsoft Fabric-based platform, modernizing data …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
Staff Software Engineer - Data. Department: Engineering. Employment Type: Full Time. Location: Cardiff. Description: is the UK's first comparison platform for car insurance. We've been helping customers since 2002 by empowering them to make better decisions around insurance and …
of financial products. Hard-working, intellectually curious, and team-oriented. Strong communication skills. Experience with options trading or options data is a strong plus. Experience with technologies like KDB, Apache Iceberg, and Lake Formation will be a meaningful differentiator. …
with interface/API data modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and DBT. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global business empowering local teams with …
London, England, United Kingdom Hybrid / WFH Options
Workato
pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies, such as open table formats like Apache Iceberg, and their impact on enterprise data strategies. Hands-on experience with data virtualization and analytics platforms (Denodo, Domo) to enable seamless self-service data exploration and analytics. …
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
experience as a Data Engineer working in cloud environments (AWS). Strong proficiency with Python and SQL. Extensive hands-on experience with AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation). Solid understanding of data modeling, ETL frameworks, and big …
Employment type: Full-time. Job function: Engineering and Information Technology. Industries: Banking, Investment Banking, and Information Services. …
Experienced Lead Data Engineer. Strong data engineering language skills (Python, SQL, JavaScript). Experience of Azure, AWS, or GCP cloud platforms and data lake/warehousing platforms such as Snowflake, Iceberg, etc. Experience of various ETL and streaming tools (Fivetran, Flink, Spark). Experience of a variety of data mining techniques (APIs, GraphQL, website scraping). Ability to translate data into meaningful …
Data Platform Engineering Lead at LSEG. LSEG (London Stock Exchange Group) is one …
engineers + external partners) on complex data and cloud engineering projects. Designing and delivering distributed solutions on an AWS-centric stack, with open-source flexibility. Working with Databricks, Apache Iceberg, and Kubernetes in a cloud-agnostic environment. Guiding architecture and implementation of large-scale data pipelines for structured and unstructured data. Steering software stack decisions, best practices, and … Deep experience with software engineering, cloud deployments (especially AWS), and orchestration technologies. Proven delivery of big data solutions, managing high-volume, complex data (structured/unstructured). Experience with Databricks, Apache Iceberg, or similar modern data platforms. Experience building software environments from scratch, setting standards and best practices. Experience leading and mentoring teams. Startup/scaleup background and adaptability …
SQL. Hands-on experience with AWS services, including AWS Lambda, Glue, Step Functions, Fargate, and CDK. Experience with Docker containerization and managing container images using AWS ECR. Experience with Apache Iceberg or a similar lakehouse engine. Knowledge of building custom ETL solutions. Data modeling and T-SQL experience for managing business data and reporting. Ability to perform technical …
queries, converting data, mapping outputs, and designing multi-step pipelines. About you: Proficient in Python. Experience in building complex data transformation pipelines. Experience with Databricks at scale, preferably with Iceberg. Familiarity with Airflow or Dagster. Experience with AWS and open-source technologies on top of DataWeave. Desirable: exposure to medical data, especially video/image data, not just tabular …
Science or a related field. Experience working on and shipping live service games. Experience working on Spring Boot projects. Experience deploying software/services on Kubernetes. Experience working with Apache Spark and Iceberg. …
standards, and drive team alignment. Work closely with stakeholders to translate business needs into scalable solutions. Tech environment includes Python, SQL, dbt, Databricks, BigQuery, Delta Lake, Spark, Kafka, Parquet, Iceberg. (If you haven't worked with every tool, that's totally fine — my client values depth of thinking and engineering craft over buzzword familiarity.) What they're looking for …