City of London, London, United Kingdom Hybrid / WFH Options
TECHNOLOGY RECWORKS LIMITED
sponsors) Knowledge and experience of the following would be advantageous: Knowledge of Enterprise Architecture Frameworks Good knowledge of Azure DevOps Pipelines Strong experience in Apache Spark framework Previous experience in designing and delivering data warehouse and business intelligence solutions using on-premises Microsoft stack (SSIS, SSRS, SSAS) Knowledge of more »
Product Manager (Product Owner Business Analysis Data Lake Data Mesh Datamesh SQL Architecture Big Data AWS Python Dremio Apache Iceberg Arrow DBT Tableau PowerBI Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund Buy Side) required by my asset … fund, asset manager, investment management) Role: Product Manager (Product Owner Business Analysis Data Lake Data Mesh Datamesh SQL Architecture Big Data AWS Python Dremio Apache Iceberg Arrow DBT Tableau PowerBI Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund more »
London, England, United Kingdom Hybrid / WFH Options
Client Server
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool more »
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flink Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets more »
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming. A desire to work in the commodities/trading sector. Permanent/Full-Time Employment. Hybrid more »
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. more »
teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and quickly resolving production issues. Contribute to more »
City of London, England, United Kingdom Hybrid / WFH Options
Penguin Recruitment
market leading software required to communicate and exchange technical information. Demonstrable academic qualifications appropriate to the role. Be a NABERS Assessor Have experience with Apache HVAC (preferred) Have a genuine interest and enthusiasm in the built environment and progress toward Net Zero Carbon. Demonstrable familiarity with key industry agendas more »
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
Senior Scala Developer - Apache Spark - Urgent Requirement Contract Length: 6 Months IR35 status: Inside Location: London - Hybrid working A Senior Scala Developer with experience in Apache Spark is needed for a British consultancy organisation. You will be an integral member of the team providing technical expertise to the more »
Elasticsearch, BigQuery, PostgreSQL FullCircl 3 Lead_Data_Engineer 04.24 · Kubernetes, Docker, Airflow KEY RESPONSIBILITIES · Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub etc. · Optimizing data storage and retrieval systems for maximum performance using both relational and NoSQL databases. · Continuously monitoring and improving the … Data Infrastructure projects, as well as designing and building data intensive applications and services. · Experience with data processing and distributed computing frameworks such as Apache Spark · Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin · Deep knowledge of data modelling, data access, and data more »
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
Proven ability to leverage CI/CD tools to streamline data pipeline development and deployment. Experience designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks (familiarity is a plus). Understanding of data warehousing concepts and data modelling techniques. Experience with SQL and more »
delivering moderate-to-complex data flows as part of a development team in collaboration with others. You’ll be confident using technologies such as: Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data … and recognised standards to build solutions using various traditional or big data languages such as: SQL, PL/SQL, SAS Macro Language, Python, Scala, Apache Spark, Java, JavaScript etc, using various tools including SAS, Hue (Hive/Impala), Kibana (Elasticsearch). Knowledge of data management on Cloud platforms more »
ETL/ELT tools. Experience with NoSQL type environments, Data Lakes, Lake-Houses (Cassandra, MongoDB or Neptune). Experience with distributed storage and processing engines such as Apache Hadoop and Apache Spark. Experience with message brokering/stream processing services such as Apache Kafka, Confluent, Azure Stream Analytics. Experience in Test Driven more »
Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands on more »
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
to leverage CI/CD tools to streamline data pipeline development and deployment. Proven expertise in designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks. Strong understanding of data warehousing concepts and data modelling techniques. Experience with SQL and proficiency in writing complex more »
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal Candidate more »