My client is seeking an experienced contractor for a 3-month project (with potential extension) to work on data extraction into PostgreSQL data marts using Apache Airflow. The project's technical architecture is already complete, and you'll be responsible for implementing the solution. Key Responsibilities: implement data … extraction pipelines into PostgreSQL; use Apache Airflow to manage workflows; collaborate with stakeholders to ensure alignment with the existing architecture; work extensively with SQL and PostgreSQL; align the solution with the initial proposal, incorporating Azure Synapse. Requirements: experience with PostgreSQL and Apache Airflow; expertise in data …
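The core task in the listing above (Airflow-orchestrated extraction into PostgreSQL data marts) is concrete enough to sketch. Below is a minimal, hypothetical DAG, assuming Airflow 2.4+ with the postgres provider installed; the connection ID, schema, table name and placeholder extraction step are illustrative assumptions, not details from the listing:

```python
# Hypothetical sketch of the kind of extraction DAG the listing describes.
# Assumes Airflow 2.4+ and apache-airflow-providers-postgres; connection
# ID, schema and table names are invented for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


def extract_to_datamart(**context):
    """Pull rows from a source system and load them into a PostgreSQL mart."""
    hook = PostgresHook(postgres_conn_id="datamart_postgres")  # assumed conn ID
    rows = [("2024-01-01", 42)]  # stand-in for the real extraction step
    hook.insert_rows(
        table="sales_mart.daily_totals",
        rows=rows,
        target_fields=["day", "total"],
    )


with DAG(
    dag_id="datamart_extraction",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_to_datamart", python_callable=extract_to_datamart)
```

In a real implementation the stand-in rows would come from whatever source system the completed architecture specifies, and loads would normally be made idempotent (for example, upserts keyed on the mart's grain) so reruns are safe.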
… or Angular good but not necessary); Agile. The following is DESIRABLE, not essential: AWS or GCP; buy-side; data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech …
… in Python for at least 3 years; Git experience, including code review using pull requests; SQL Server or NoSQL development; exposure to ETL pipelines (Airflow, Spark, dbt) for 2+ years; clean code, test-driven development and other good coding practices; React.js and Redux preferred, good for 1+ years; working knowledge …
… Proficiency in effective code management, collaboration and version control: Git. Adequate knowledge of and experience with some of the following: APIs, YAML, Kafka, Airflow, JSON, Avro, Parquet. Professional Experience & Education: 7+ years of experience in data engineering; STEM, Finance, or Economics degree preferred, Master's degree a bonus; relevant certification …
… or Java or C++. Proficiency with RDBMS, NoSQL, and distributed computing platforms such as Spark, Dask, or Hadoop. Experience with any of the following systems: Apache Airflow, AWS/GCE/Azure, Jupyter, Kafka, Docker, Kubernetes, or Snowflake. Bachelor's, Master's or Ph.D. degree in Computer Science or equivalent …
… years of professional experience in a computer science/computational role; experience working in a technical environment with DevOps functions (Google Cloud, Airflow, InfluxDB, Grafana); design and implementation of front-office systems for quant trading. Highly Valued Relevant Experience: knowledge of machine learning and statistical techniques and related libraries …
… tools (e.g., Glue, QuickSight); proficiency in Python and Java; experience with data lakes and data pipelines; knowledge of ETL processes and tools such as Apache Airflow (experience with Kafka is a plus); familiarity with data governance, particularly GDPR compliance; strong problem-solving skills and the ability to work …
… collaborative design & development; shared code ownership & cross-functional teams. Bonus points if you: have experience with serverless architectures; are experienced with job orchestration frameworks (e.g. Airflow, MWAA on AWS); have MLOps knowledge and a grasp of the basic concepts; have a strong interest in health/fitness technologies. Our tech stack: Below …
… junior engineers and contribute to continuous learning within the team. Technical Stack: Frontend: React.js, Redux; Backend: Python; Databases: Hive, MongoDB, SQL Server; ETL Pipelines: Airflow, Spark, dbt; Other: Docker, Git, test-driven development. Requirements: 5+ years of full-stack development experience in Python; 5+ years of experience with SQL …
… the perfect role for you. What You'll Do: Maintain Stability: ensure the smooth daily production of cloud data pipelines using tools like Kubernetes, Airflow, and AWS. Create Dashboards: develop dashboards for risk analytics, signal generation, and trade simulations, used by both the quant team and the wider desk. …
… deploying ML models in a cloud environment. Solid understanding of algorithms, data structures, and software engineering best practices. Familiarity with data pipeline tools (e.g., Airflow, Spark) and version control systems like Git. Strong problem-solving skills and the ability to work both independently and in a collaborative environment. Excellent …
… Advanced degree in Computer Science, Data Engineering, AI or a related field. Extensive experience in RAG pipeline frameworks and orchestration tools (LlamaIndex, LangChain, Spark, Kafka, Airflow). Demonstrated ability with Python and various DBs (MongoDB, Pinecone, Elasticsearch, Pgvector, Neo4j). Strong background in LLM-as-a-service and cloud technologies …
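As a rough illustration of one step such a role involves, here is a hedged sketch of the retrieval side of a RAG pipeline against Pgvector, one of the stores this listing names. The DSN, table, column names and the assumption that query embeddings arrive as Python lists are all hypothetical:

```python
# Illustrative RAG retrieval step backed by Postgres + pgvector.
# Everything named here (DSN, table, columns) is an assumption.
import psycopg2


def retrieve_similar(query_embedding: list[float], k: int = 5) -> list[str]:
    """Return the k stored chunks closest to the query embedding."""
    conn = psycopg2.connect("dbname=rag user=rag")  # assumed DSN
    try:
        with conn.cursor() as cur:
            # `<->` is pgvector's L2-distance operator; the embedding is
            # passed in its text form, e.g. '[0.1, 0.2, ...]'.
            cur.execute(
                "SELECT chunk_text FROM documents "
                "ORDER BY embedding <-> %s::vector LIMIT %s",
                (str(query_embedding), k),
            )
            return [row[0] for row in cur.fetchall()]
    finally:
        conn.close()
```

The retrieved chunks would then be stuffed into the LLM prompt by whichever framework the team settles on; LlamaIndex and LangChain both wrap this pattern.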
… data pipelines and products; at least 2 years' experience with SQL, including data modelling (DBT/Dataform); experience with streaming and batch ETL solutions using Airflow for orchestration; familiarity with cloud-based services such as GCP, AWS or Azure; deep understanding of data best practices, e.g. CI/CD and …
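Since this listing pairs DBT-based modelling with Airflow orchestration, a sketch of how the two commonly meet may be useful. The DAG id, schedule and project path are assumptions, and teams often use a dedicated dbt operator rather than plain BashOperator:

```python
# Illustrative batch-ETL orchestration: Airflow triggering dbt.
# DAG id, schedule and --project-dir path are invented for the example.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # dbt materialises the SQL models, then tests run so a bad build
    # fails the DAG run rather than silently shipping broken marts.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test
```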
… Your Profile. Key skills/knowledge/experience: Advanced Data Engineering Skills: proficiency in designing and managing ETL processes using DBT, Python, Terraform and Airflow. Expertise in Cloud Platforms: in-depth knowledge of Snowflake and Azure, with experience in leveraging these platforms for scalable data solutions. Data Architecture and …
… days in the office per week). Sector: Banking/FinTech. Skills/Experience Required: familiarity with AWS, Redshift, SQL, Relational Database Service (RDS), Airflow, Glue; data analysis experience (data modelling, entity relationships (ER), mapping documents, data dictionaries); Product Owner/Business Analysis experience (requirement gathering); experience in a financial …
… with SQL; experience with cloud environments like AWS, GCP, etc.; experience with deploying ML models to production and working with DataOps tools like Docker, Airflow, DBT. What's in it for you 💰: competitive starting salary (depending on experience!); ability to directly impact clients and long-term AI strategy; opportunity …
… with AWS; experience with modern build tools (Jenkins, GitHub, etc.); experience with Spring Boot or a similar API framework; experience with scheduling services such as Airflow and Oozie. Responsibilities: Developing Data Pipelines: create and manage robust data pipelines to ensure seamless data flow across our platforms. Optimising Data Storage and Retrieval …
… decision-making applications. Cloud Platform Experience: proficiency with cloud-based platforms like Databricks, Snowflake, and Google BigQuery, and experience deploying workflows with tools like Airflow or Databricks. Product Deployment Track Record: proven experience taking data science products from conception to production deployment. Commitment to Quality: strong focus on accuracy …
… deployment architectures. Tooling & Best Practices: develop and implement best practices for model development, deployment, and management, leveraging modern MLOps tools like Docker, Kubernetes, MLflow, Airflow, etc. Qualifications: Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field. Experience: 3+ years of …
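To ground the MLOps tooling this listing mentions, here is an illustrative MLflow tracking snippet; the tracking URI, experiment name and toy model are assumptions chosen only to show the logging pattern:

```python
# Hedged example of experiment tracking with MLflow (one of the tools
# the listing names). Server URI and experiment name are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_tracking_uri("http://mlflow.internal:5000")  # assumed server
mlflow.set_experiment("churn-model")                    # assumed experiment

X, y = make_classification(n_samples=1_000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    # Logged params/metrics let runs be compared in the MLflow UI.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```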
… Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python, SQL, Terraform and Airflow. Experience in designing and implementing data products and solutions on cloud-based architectures. Cloud Platforms: experience working with cloud data warehouses and analytics platforms …
… related to a wider Data Strategy 🔧 Enhance your skills and be part of a function using your expertise in Python, Spark and AWS (Kinesis, Apache Airflow) 🔧 Create pipelines for model evaluations, including interactive dashboards, tables, and plots, to display insights and projections to non-technical project stakeholders 🔧 Focus …
… for real-time processing and scale-out, and improve its reliability and monitoring. Stack in play: Python, SQL (ETL), Kafka, Flink, Kubernetes, Docker, Helm, Airflow. If you are an excellent Financial Markets Data Engineer/Developer and would like to transition to a major hedge fund to develop new …
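A hedged sketch of the consuming edge of such a real-time stack, using the kafka-python client; the topic, broker address, group id and message shape are all assumptions:

```python
# Illustrative consumer loop for a Python + Kafka market-data pipeline.
# Topic, broker and message fields are invented for the example.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "market-ticks",                  # assumed topic
    bootstrap_servers="kafka:9092",  # assumed broker address
    group_id="md-etl",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    tick = message.value
    # Downstream this might feed Flink or a SQL ETL step; printing here
    # just shows the shape of the loop.
    print(tick["symbol"], tick["price"])
```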
… solutions meet standards for accessibility, security, data quality, and ethical handling. Location: London (can work fully remote). Tech Stack: Python, FastAPI/Flask, Spark, Airflow, SQL, PostgreSQL, SQL Server, Azure Data Lakes, Kubernetes. Salary: up to 75k (plus benefits: 13% bonus, 12% pension contribution, 30 days holiday). If this …
… Terraform would be great. AI/ML techniques are becoming more prominent, so SageMaker etc. would be a nice-to-have. Data platform technologies (Redshift/Airflow etc.) are also a nice-to-have. The interview process will consist of 3 stages: an initial call with the hiring manager, followed by a …