London, England, United Kingdom Hybrid / WFH Options
Yelp USA
… we’re looking for great people, not just those who simply check off all the boxes. What you'll do: Work with technologies like Apache Lucene, Apache Flink, Apache Beam, and Kubernetes to build core components of Yelp’s search infrastructure. Design, build, and maintain scalable real … and complexity analysis. Comprehensive understanding of systems and application design, including operational and reliability trade-offs. Experience with distributed data processing frameworks such as Apache Flink or Apache Beam. Familiarity with search technologies like Apache Lucene or Elasticsearch is a plus. Experience working with containerized environments and …
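The core data structure behind Lucene-style search is the inverted index: a map from each term to the documents that contain it. A minimal sketch in plain Python, purely illustrative (this is not Lucene's actual API or storage format, and the sample documents are hypothetical):

```python
from collections import defaultdict

def build_index(docs):
    """docs: dict of doc_id -> text. Returns term -> set of doc_ids."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """AND-query: return ids of docs containing every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())  # intersect posting sets
    return result

docs = {
    1: "cheap pizza near downtown",
    2: "best pizza in town",
    3: "downtown sushi bar",
}
index = build_index(docs)
print(search(index, "pizza downtown"))  # → {1}
```

Real engines add ranking (e.g. BM25), compressed postings lists, and analyzers, but the index-then-intersect shape is the same.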
/or Python. Experience with tools like Terraform to provision AWS cloud services. Knowledge of AWS Glue, AWS Athena, and AWS S3. Understanding of Apache Parquet and open table formats such as Delta, Iceberg, and Hudi. Experience with Test Driven Development using JUnit, Mockito, or similar tools. Extensive knowledge …
London, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such … as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and enhance Elasticsearch … Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka …
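The idea underlying workflow orchestrators such as Airflow and NiFi is that tasks form a directed acyclic graph (DAG) and execute in an order that respects their dependencies. A stdlib-only sketch of that scheduling idea (task names are hypothetical; this is not Airflow's API):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each key depends on the tasks in its value set.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A valid execution order: every task runs after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # extract first, report last
```

Airflow adds scheduling, retries, and distributed executors on top, but dependency resolution is this topological sort at heart.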
technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX markets and financial instruments would be beneficial. What we'll …
stakeholders at all levels, provide training, and solicit feedback. Preferred qualifications, capabilities, and skills Experience with big-data technologies, such as Splunk, Trino, and Apache Iceberg. Data Science experience. AI/ML experience with building models. AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer). About Us …
factor app development standards Experience building modern enterprise applications and deploying to public or private clouds including AWS Experience in distributed cache systems like Apache Ignite or Redis Experience in big data platforms and technologies such as Hadoop, Hive, HDFS, Presto/Starburst, Spark, and Kafka Experience in Spring …
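Distributed caches such as Redis and Apache Ignite are most often used in the cache-aside pattern: check the cache first, fall back to the slow store on a miss, then populate the cache. A minimal sketch, with a plain dict standing in for the cache and a hypothetical key/value "database":

```python
database = {"user:1": "Alice", "user:2": "Bob"}  # stand-in for a slow store
cache = {}

def get_user(key):
    if key in cache:            # cache hit: no round-trip to the store
        return cache[key]
    value = database[key]       # cache miss: read from the slow store
    cache[key] = value          # populate the cache for next time
    return value

print(get_user("user:1"))  # miss, reads the store → "Alice"
print(get_user("user:1"))  # hit, served from cache → "Alice"
```

Production caches add TTL-based expiry and invalidation on writes, which is where most of the real design work lives.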
source platform that we will teach you. Read more on Bloomberg . Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, BigData, HTML 5, AngularJS/ReactJS, IntelliJ, GitLab, Jira. Cloud Technologies: You'll be involved in building the next generation of finance systems …
highest data throughput are implemented in Java. Within Data Engineering, Dataiku, Snowflake, Prometheus, and ArcticDB are used heavily. Kafka is used for data pipelines, Apache Beam for ETL, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker …
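Beam-style ETL is built from composable transforms: a pipeline is a chain of map/filter stages applied to a collection of records. A sketch of that style using plain generators (not Beam's actual `PCollection`/`ParDo` API; the symbol/price records are hypothetical):

```python
def read(records):
    # Source stage: emit raw input lines.
    yield from records

def parse(lines):
    # Transform stage: parse "SYMBOL,price" into typed tuples.
    for line in lines:
        symbol, price = line.split(",")
        yield symbol, float(price)

def keep_positive(rows):
    # Filter stage: drop records with non-positive prices.
    for symbol, price in rows:
        if price > 0:
            yield symbol, price

raw = ["AAPL,189.5", "MSFT,-1.0", "GOOG,141.2"]
pipeline = keep_positive(parse(read(raw)))
print(list(pipeline))  # [("AAPL", 189.5), ("GOOG", 141.2)]
```

Because each stage is lazy, records stream through one at a time, which is the same property that lets Beam run the identical pipeline in batch or streaming mode.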
practices include OWASP guidelines/top 10, SOC 2, and NCSC cloud security principles. Experience in data and orchestration tools including some of dbt, Apache Airflow, Azure Data Factory. Experience in programming languages including some of Python, TypeScript, JavaScript, R, Java, C#, producing services, APIs, Function Apps or Lambdas. …
understanding of tradable financial instruments (securities, derivatives) and capital markets Computer Science, Math, or Financial Engineering degree Strong knowledge of data orchestration technologies – e.g., Apache Airflow, Dagster, AWS Step Functions Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing. Role …
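The batch vs. real-time distinction mentioned above comes down to when aggregation happens: a batch job computes over the full dataset at once, while a streaming job maintains an incremental aggregate (often over a sliding window) as events arrive. A stdlib-only sketch with hypothetical price values:

```python
from collections import deque

def batch_average(prices):
    # Batch: the whole dataset is available up front.
    return sum(prices) / len(prices)

def sliding_window_averages(prices, window=3):
    # Streaming: emit a windowed average as each event arrives.
    buf = deque(maxlen=window)
    for p in prices:
        buf.append(p)
        yield sum(buf) / len(buf)

prices = [10.0, 20.0, 30.0, 40.0]
print(batch_average(prices))                  # 25.0
print(list(sliding_window_averages(prices)))  # [10.0, 15.0, 20.0, 30.0]
```

Engines like Flink generalize the streaming side with event-time windows and watermarks, but the incremental-state idea is the same.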
SDLC automation tools like JIRA, Bamboo, and Ansible is a plus Qualifications/Experience: Strong programming skills in Java; experience with Spring, Hibernate, and Apache Ignite is a plus Ability to write complex SQL queries Experience with Fidessa Equities platform ETP/CTAC is desirable Unix/Linux command …
AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance …
diagram of proposed tables to enable discussion. Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users. Experience using Apache Airflow to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing and sprints. More information: Enjoy fantastic perks like …
Flask, Django, or FastAPI. Proficiency in Python 3.x and libraries like Pandas, NumPy, and Dask. Experience with data manipulation and processing frameworks (e.g., PySpark, Apache Beam). Strong knowledge of databases, including SQL and NoSQL (e.g., PostgreSQL, MongoDB). Familiarity with ETL processes and tools such as Airflow or …
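A representative data-manipulation task of the kind typically done with Pandas or PySpark is group-and-aggregate. A sketch using only the standard library (the records and field names are hypothetical; `itertools.groupby` stands in for the dataframe-style `groupby`):

```python
from itertools import groupby
from operator import itemgetter

records = [
    {"city": "London", "amount": 120},
    {"city": "Paris",  "amount": 80},
    {"city": "London", "amount": 30},
]

# Unlike Pandas' groupby, itertools.groupby only groups adjacent
# records, so the input must be sorted by the grouping key first.
records.sort(key=itemgetter("city"))
totals = {
    city: sum(r["amount"] for r in rows)
    for city, rows in groupby(records, key=itemgetter("city"))
}
print(totals)  # {"London": 150, "Paris": 80}
```

In Pandas the same computation is roughly `df.groupby("city")["amount"].sum()`; distributed engines like PySpark do the sort-or-shuffle step across machines.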
Strong communication skills and the ability to work in a team. Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS Experience with Kubernetes deployment architectures Apache NiFi experience Experience building trading controls within an investment bank ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to …
communication skills and demonstrated ability to engage with business stakeholders and product teams. Experience in data modeling , data warehousing (e.g., Snowflake , AWS Glue , EMR , Apache Spark ), and working with data pipelines . Leadership experience—whether technical mentorship, team leadership, or managing critical projects. Familiarity with Infrastructure as Code (IaC) …
modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and dbt. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global business …