data warehouse technologies and relevant data modeling best practices (Presto, Athena, Glue, etc.) Experience building data pipelines/ETL and familiarity with design principles (Apache Airflow is a big plus!) Excellent SQL and data manipulation skills using common frameworks like Spark/PySpark, or similar. Proficiency in a major more »
processes, and mastery in designing and implementing scalable solutions. From a tech perspective, the team uses the following: Python, SQL, AWS, Docker, Redshift, Kafka, Airflow, S3. Deep experience in Python, SQL and AWS is mandatory for this role. This is a great opportunity to join a firm at the more »
building APIs & SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker & Kubernetes for containerisation & orchestration. Understanding of Airflow & DAGs. Experience in building applications using Kafka. Solid OOP principles & design patterns. Permanent/Full-Time Employment. Hybrid working environment (2/3 days more »
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional more »
on quality system maintenance. Strong knowledge of data engineering principles, including ETL pipelining, shell scripting, and Elasticsearch. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Excellent technical communication skills, with a track record of thriving in fast-paced environments. Ability to support, modify, and maintain systems and more »
Machine Learning Engineer (Multimodal) Up to £100,000 London Hybrid/Remote Driving innovation with AI and machine learning to revolutionize financial services and enhance customer experiences. COMPANY Harnham has partnered with a leading Fintech company using advanced AI technology more »
Developer (Financial Services): - Extensive experience working within a Python Development capacity essential - Data mindset essential - Financial services experience essential - Good DB knowledge required - Apache Airflow, Pandas, Dask all nice to have - Happy to undertake a HackerRank assessment as part of selection process Please note we receive more »
Cloud Platform (GCP) data warehouse and data solutions team. If you're passionate about leveraging cutting-edge technologies like Google Cloud, BigQuery, ETL, and Apache Airflow, this role is perfect for you. Key Responsibilities: - Data Integration & ETL: Design and implement robust ETL processes to integrate data from various … processes to maintain data integrity. What We're Looking For: - Proven experience in data engineering, with a strong focus on GCP, BigQuery, ETL, and Apache Airflow. - A strategic thinker with the ability to lead and develop a data function from the ground up. - A team player who thrives in more »
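Listings like the one above centre on designing ETL processes that integrate data while maintaining data integrity. As an illustration only (none of the table, column, or function names come from the listing, and sqlite3 stands in here for a warehouse such as BigQuery; in practice each function would typically run as a task in an Airflow DAG), a minimal extract-transform-load sketch:

```python
# Hypothetical ETL sketch: extract raw CSV, transform/validate, load to a store.
# sqlite3 is a stand-in for a real warehouse; names are illustrative only.
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV rows into dictionaries.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: normalise names and cast amounts, dropping malformed rows
    # to protect downstream data integrity.
    out = []
    for row in rows:
        try:
            out.append((row["name"].strip().lower(), float(row["amount"])))
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    # Load: write the cleaned rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

raw = "name,amount\n Alice ,10.5\nBob,oops\nCarol,3\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
print(loaded)  # 2 (the malformed "Bob" row is dropped)
```

The three-function split mirrors how orchestrators such as Airflow encourage structuring pipelines: each stage is independently testable and retryable.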
decision-making applications. Cloud Platform Experience : Proficiency with cloud-based platforms like Databricks, Snowflake, and Google BigQuery, and experience deploying workflows with tools like Airflow or Databricks. Product Deployment Track Record : Proven experience taking data science products from conception to production deployment. Commitment to Quality : Strong focus on accuracy more »
and Interest in or experience working with Rust. Strong understanding of clean coding principles (PEP8, etc.). Strong knowledge across backend/infra - Docker, Kubernetes, Airflow, etc. Experienced with cloud technologies e.g. AWS, GCP, Azure. Strong interest in building complex systems in a simplistic way. Thorough understanding of computer science more »
in SQL, NoSQL, Blob, Delta Lake, and other enterprise scale data stores. Data Orchestration - Enterprise scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure more »
control their data and processes effectively. Required Skills and Experience Technologies: Backend (Node.js, Typescript), Frontend (React.js, Typescript), Database (MySQL), and familiarity with ECS, K8s, Airflow, Kafka, Serverless, AWS. Experience: Minimum 5 years in engineering leadership roles. Expertise in handling data-heavy products and presenting actionable insights. Background in cybersecurity more »
Up to £500 p/day inside IR35 3 month Contract AWS, Airflow We are seeking an experienced contractor to provide interim technical guidance to a data engineering team for 3 months. This role is essential to maintaining operational stability and progressing key projects during a transition period. What … projects and overseeing a small data engineering team, ensuring continuity and stability. A key responsibility will include planning and executing an ETL migration to Airflow, in collaboration with the data architect and architecture team. The contractor will play a supportive role in new proof of concept and pilot projects more »
Wantage, Oxfordshire, South East, United Kingdom Hybrid / WFH Options
Corriculo Ltd
native development and leading transformation and cloud migration projects would be advantageous A breadth of technical experience encompassing technologies such as C#.NET, HPC, Kubernetes, Airflow, etc. I want to do that! If you have any questions or would simply welcome a chat about this excellent Software Engineering Manager position more »
complex technical issues. Integration testing experience. Understanding of TCP/IP and OSI Models. Hands-on experience with data pipeline tools (e.g., NiFi and Airflow). Past experience designing scalable data systems with a focus on security and GDPR compliance. Hybrid cloud experience with on-premise and AWS cloud more »
or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio This is an environment that has been described as the only corporate environment with a start-up/fintech more »
automate scalable cloud solutions using Google Cloud Platform native tools (e.g., Data Prep, Data Proc, Data Fusion, Data Flow, DataForm, DBT, Big Query) and Apache Airflow. > Operationalize and automate data best practices: quality, auditability, timeliness and completeness > Monitor and enhance the performance and scalability of data processing systems to … open-source tools like Terraform. # Experience with CI/CD practices and tools such as Tekton. # Knowledge of workflow management platforms like Apache Airflow and Astronomer. # Proficiency in using GitHub for version control and collaboration. # Ability to design and maintain efficient data pipelines. # more »
implement the systems that require the highest data throughput in Java. We implement most of our long running services and analytics in C#. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, ELK for logs, Grafana, Prometheus & InfluxDB for metrics, Docker more »
and verbal) Proven track record at using technology to automate data processes Strong SQL skills, including Jinja templating/DBT Experience with Snowflake and Airflow (Astronomer) Experience with Java Experience with Python including exposure to data science packages (NumPy, Pandas, scikit-learn) Experience manipulating, processing and extracting value from more »
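Several listings above mention Jinja templating with dbt for generating SQL. As a rough, hypothetical sketch of the idea (using the stdlib `string.Template` as a stand-in for Jinja, with made-up table and column names that do not come from any listing):

```python
# Hypothetical templated-SQL sketch, in the spirit of a dbt model.
# string.Template stands in for Jinja; all identifiers are illustrative.
from string import Template

MODEL = Template("""
SELECT customer_id, SUM(amount) AS total_spend
FROM $source_table
WHERE order_date >= '$start_date'
GROUP BY customer_id
""")

# Rendering the template yields concrete SQL for a given source and window,
# which is the core mechanic dbt uses to keep models environment-agnostic.
sql = MODEL.substitute(source_table="raw.orders", start_date="2024-01-01")
print(sql)
```

In real dbt models the same effect is achieved with Jinja expressions such as `{{ ref('orders') }}`, which additionally wire up model dependencies for the DAG.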
tools (e.g., Glue, QuickSight) Proficiency in Python and Java Experience with data lakes and data pipelines Knowledge of ETL processes and tools such as Apache Airflow (experience with Kafka is a plus) Familiarity with data governance, particularly GDPR compliance Strong problem-solving skills and the ability to work more »
implementation. You love solving complex data problems and researching new technologies and tools. Experience in data technologies and languages such as kdb, dbt, SQL, Airflow, Pandas, Polars, Snowflake, and Kafka. Experience working on modern infrastructure, e.g. K8s, Docker, Helm, etc. Strong academic background in a STEM subject from a more »
teams. Requirements & Qualifications: Strong programming skills in at least one major language (Python, Java, C++, etc.). Experience building robust, scalable data pipelines (DBT, Airflow, or similar). Proficiency in data manipulation tools (Pandas, Polars). Proven experience with TCA systems, including benchmarking and pricing across various asset classes. more »
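The TCA (transaction cost analysis) benchmarking mentioned above typically means comparing execution prices against a reference such as the arrival price. A minimal, hypothetical sketch (the function name and sign convention are illustrative assumptions, not any firm's actual methodology):

```python
# Hypothetical TCA benchmark sketch: slippage of an execution against an
# arrival-price benchmark, expressed in basis points. Positive = cost.
def slippage_bps(exec_price: float, arrival_price: float, side: str) -> float:
    # For buys, executing above arrival is a cost; for sells, below arrival.
    sign = 1.0 if side == "buy" else -1.0
    return sign * (exec_price - arrival_price) / arrival_price * 10_000

print(slippage_bps(100.5, 100.0, "buy"))   # 50.0 (paid 50bps above arrival)
print(slippage_bps(99.5, 100.0, "sell"))   # 50.0 (sold 50bps below arrival)
```

Production TCA systems extend this idea across benchmarks (VWAP, close, implementation shortfall) and asset classes, which is where the Pandas/Polars experience the listing asks for comes in.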
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
in Python Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery Hands-on experience with data orchestrators such as Airflow Knowledge of Agile development methodologies Awareness of cloud technology, particularly AWS. Knowledge of automated delivery processes Hands-on experience of best engineering practices (handling more »
be able to apply this expertise to business problems to generate value. We currently work in an AWS, Snowflake, dbt, Looker, Python, Kinesis and Airflow stack and are building out our real-time data streaming capabilities. You should be very comfortable with these (or similar). As an individual more »