React or Angular good but not necessary) Agile
The following is DESIRABLE, not essential:
AWS or GCP
Buy-side
Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio
Fixed Income performance, risk or attribution
TypeScript and Node
Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income JavaScript Node Fixed Income Credit Rates Bonds ABS Vue
… in the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are 9-5. Salary …
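The listing above names Athena, Glue and Iceberg among the data tools in play. As a rough, non-authoritative sketch of what querying such a stack from Python can look like (the database, table, column and bucket names below are invented for illustration and are not taken from the listing), an Athena query might be issued through boto3:

```python
import time

import boto3  # AWS SDK for Python; credentials are assumed to be configured

athena = boto3.client("athena", region_name="eu-west-2")

# Hypothetical database, table and result bucket, for illustration only.
response = athena.start_query_execution(
    QueryString="SELECT isin, yield_to_maturity FROM bond_positions LIMIT 10",
    QueryExecutionContext={"Database": "fixed_income"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes, then print the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```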
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
rather huge and includes Python (Pandas, NumPy, SciPy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in … standards. Develop and deliver documentation for each project, including ETL mappings, a code use guide, code location and access instructions. Design and optimize data pipelines using tools such as Spark, Apache Iceberg, Trino, OpenSearch, EMR cloud services, NiFi and Kubernetes containers. Ensure the pedigree and provenance of the data is maintained such that the access to data is protected …
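The responsibilities above mention designing pipelines with Spark and Apache Iceberg. As a minimal sketch only (the catalog name, warehouse bucket, source path and table name are assumptions, not taken from the listing, and the Iceberg Spark runtime and AWS bundle are assumed to be on the classpath), a Spark session wired to a Glue-backed Iceberg catalog might append to a table like this:

```python
from pyspark.sql import SparkSession

# Illustrative Spark/Iceberg wiring; names and paths are placeholders.
spark = (
    SparkSession.builder.appName("pipeline-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.catalog-impl",
            "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-warehouse/")
    .getOrCreate()
)

# Read a raw landing zone, apply a simple transformation, and append to an
# Iceberg table (assumed to exist) so other engines such as Trino can query it.
raw = spark.read.parquet("s3://example-landing/events/")
cleaned = raw.dropDuplicates(["event_id"]).filter("event_ts IS NOT NULL")
cleaned.writeTo("lake.analytics.events").append()
```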
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
rather huge and includes Python (Pandas, NumPy, SciPy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in … standards. Develop and deliver documentation for each project, including ETL mappings, a code use guide, code location and access instructions. Design and optimize data pipelines using tools such as Spark, Apache Iceberg, Trino, OpenSearch, EMR cloud services, NiFi and Kubernetes containers. Ensure the pedigree and provenance of the data is maintained such that the access to data is protected …
San Jose, California, United States Hybrid / WFH Options
PayPal
8+ years of professional experience in software development, with proven experience in backend development in TypeScript and Java. Experience with modern data warehousing technologies (e.g., Snowflake, BigQuery, Databricks, Apache Iceberg) and informed perspectives on their advantages, limitations, and how they fit into evolving data architectures. High proficiency and strong skills in Cloud Technologies, object-oriented, and …
e.g. Glue and S3. Solid grasp of data governance/data management concepts, including metadata management, master data management and data quality. Ideally, experience with a Data Lakehouse toolset (Iceberg).
What you'll get in return: Hybrid working (4 days per month in London HQ + as and when required), access to market-leading technologies. What you need to …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
Staff Software Engineer - Data
Department: Engineering
Employment Type: Full Time
Location: Cardiff
Description: The UK's first comparison platform for car insurance. We've been helping customers since 2002 by empowering them to make better decisions around insurance and …
London, England, United Kingdom Hybrid / WFH Options
Workato
pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies such as open table formats, Apache Iceberg, and their impact on enterprise data strategies. Hands-on experience with data virtualization and analytics platforms (Denodo, Domo) to enable seamless self-service data exploration and analytics.
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
and learning new technologies quickly. Preferred Qualifications: Experience with software development. Experience with geospatial data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open …
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
experience as a Data Engineer working in cloud environments (AWS). Strong proficiency with Python and SQL. Extensive hands-on experience with AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation). Solid understanding of data modeling, ETL frameworks, and big …
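Since the listing above pairs Glue with PySpark, here is a hedged sketch of what a Glue PySpark job can look like; the job parameter handling follows the standard Glue boilerplate, while the database, table, filter and output bucket are placeholders invented for illustration (the awsglue module is only available inside the Glue job runtime):

```python
import sys

from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Read a table from the Glue Data Catalog, convert to a Spark DataFrame,
# apply a simple filter, and write partitioned Parquet back to S3.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session

dyf = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_orders"
)
df = dyf.toDF().filter("order_total > 0")

(df.write.mode("overwrite")
   .partitionBy("order_date")
   .parquet("s3://example-curated/orders/"))
```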
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack, including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
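The listing above centres on building pipelines with Airflow over a Postgres/Kafka/Iceberg stack. As a minimal sketch assuming Airflow 2.4+ (DAG id, schedule, task names and the placeholder task bodies are all invented for illustration), a two-step daily DAG might be declared like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    # Placeholder: e.g. pull a daily batch from PostgreSQL and stage it as Parquet.
    ...

def load_to_lake(**context):
    # Placeholder: e.g. append the staged batch to a warehouse/Iceberg table.
    ...

with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_lake", python_callable=load_to_lake)
    extract >> load  # extraction must finish before the load task runs
```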
London, England, United Kingdom Hybrid / WFH Options
Automata
and SQL for data processing, analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark and Glue to develop end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes of structured and unstructured data from diverse sources.
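For the streaming side of the pipelines described above, a hedged sketch with Spark Structured Streaming reading from Kafka and landing Parquet on S3 could look like the following (the topic, broker address, schema and paths are assumptions for illustration, and the spark-sql-kafka connector is assumed to be available):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("streaming-etl-sketch").getOrCreate()

# Illustrative event schema for the JSON payloads on the Kafka topic.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("status", StringType()),
])

# Consume JSON events from Kafka and parse them into columns.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "device-events")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write the parsed stream to Parquet with checkpointing for fault tolerance.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3://example-curated/device_events/")
    .option("checkpointLocation", "s3://example-checkpoints/device_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```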
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Identify Solutions
Want to drive the Data team at a top brand with 1m+ users? If you love building software in Python, implementing robust data pipelines & driving best practices, you may be interested in a Senior Engineer role I have with a highly …
London, England, United Kingdom Hybrid / WFH Options
Experteer Italy
Expertise in data warehousing, data modelling, and data integration.
* Experience in MLOps and machine learning pipelines.
* Proficiency in SQL and data manipulation languages.
* Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and ClickHouse) and cloud-based infrastructure on AWS.
Education & Qualifications
* Bachelor's or Master's degree in Computer Science, Engineering, or …
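Among the platforms listed above, Apache Arrow is the in-memory columnar layer the others build on. As a toy illustration only (file name and columns are made up), pyarrow can build a table, persist it as Parquet, and run a columnar aggregation without going through pandas:

```python
import pyarrow as pa
import pyarrow.compute as pc
import pyarrow.parquet as pq

# Build a small Arrow table in memory and persist it as Parquet.
table = pa.table({
    "instrument": ["bond_a", "bond_b", "bond_c"],
    "price": [101.2, 98.7, 100.4],
})
pq.write_table(table, "prices.parquet")

# Read it back and compute a simple columnar aggregation.
loaded = pq.read_table("prices.parquet")
print(pc.mean(loaded["price"]))
```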
developing new and enhancing existing open-source-based Data Lakehouse platform components. Experience cultivating relationships with and contributing to open-source software projects. Experience with open-source table formats (Apache Iceberg, Delta, Hudi or equivalent). Experience with open-source compute engines (Apache Spark, Apache Flink, Trino/Presto, or equivalent). Experience with cloud computing (AWS, Microsoft …