SENIOR DATA ENGINEER LONDON/HYBRID £65,000-75,000 PER ANNUM This fast-growing e-commerce company is searching for a new Senior Data Engineer. You will be responsible for building out and maintaining the company's Azure platform.
DBT (Data Build Tool). Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an app developer). Familiarity with Airflow (as an app developer). Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. If you are passionate about data engineering
degree in Computer Science, or similar, and a minimum of 4-5 years' exposure to back-end development in Python and Kubernetes automation. Skills in Airflow are of extra interest. In this role, you will: Execute elite software solutions; help with design, development, and technical troubleshooting. Create secure and high-quality … conclusions to identify improvements and contribute to better decision-making in terms of creating secure and stable software application development. Key skills: Python, Kubernetes, Airflow, distributed systems, Computer Science degree. Market-leading salary and bonus on offer, hybrid working, swish offices and excellent work culture. £120-150k++
financial services or energy trading industry. Expertise in Python and its ecosystem of libraries and frameworks for data processing, data analysis and data visualisation. Airflow: detailed understanding of architecture, including schedulers, executors, operators. Cloud environments: understanding of principles, technologies and services for AWS/Azure, Kubernetes EKS/AKS … including high availability. Desired experience: Worked with Python 3.9+. Familiar with Python test automation. Experience with SQL and timeseries databases. Familiar with Parquet, Arrow, Airflow, Databricks. Experience with AWS cloud services, such as S3, EC2, RDS etc. Quality engineering best practice and tooling including TDD, BDD.
tooling: containers (e.g., Docker), container orchestration (Kubernetes/K8s), CI/CD experience, version control (Git, GitHub, GitLab), orchestration/DAG tools (e.g., Argo, Airflow, Kubeflow), Infrastructure as Code (Terraform, etc.) HOW TO APPLY Please register your interest by sending your CV to niall.wharton@xcede.com or click the Apply
data processing, data analysis and data visualisation. SQL and timeseries databases. AWS cloud services, such as S3, EC2, RDS etc. ETL tools, such as Airflow. Git, CI/CD, testing tools, supporting documentation and best practices. Best practice and tooling including TDD, BDD. Domain and soft skills summary office
London, England, United Kingdom Hybrid / WFH Options
Harnham
Engineer will have: Worked in a Software Engineering setting. Strong Python coding skills, including testing principles (unit testing, TDD/BDD). Experience working with Airflow, DBT, and Terraform. An understanding of cloud architecture. RECRUITMENT PROCESS: A technical test and a 1-hour online chat. HOW TO APPLY: Please register your interest
required: Python, SQL, Kubernetes. CI/CD experience with relevant tooling such as Jenkins, Docker or Terraform. Cloud services experience with AWS/Azure. Ideally: Airflow, Java. Experience working with front office trading systems and financial market data. For more information on this role or any other contract/permanent
AWS preferred) Solid understanding of libraries like Pandas and NumPy Experience in data warehousing tools like Snowflake, Databricks, BigQuery Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools Commercial experience with performant database programming in SQL Capability to solve complex technical issues, comprehending risks prior to
processing and analytics Desired experience: Worked with Python 3.9+ Familiar with Python test automation Experience with SQL and Timeseries databases Familiar with Parquet, Arrow, Airflow, Databricks Experience with cloud AWS services, such as S3, EC2, RDS etc Quality engineering best practice and tooling including TDD, BDD
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
data practices Possess strong knowledge of data tools, data management tools, and various data and information technologies, e.g. DAMA DMBOK, Microsoft SQL Server, Couchbase, Apache Druid, Spark, Kafka, Airflow, etc. In-depth understanding of modern data principles, methodologies, and tools. Excellent communication and collaboration skills, with the ability … native computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with Microsoft, Red Hat, and Apache data and software engineering environments. A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team
4. Monitoring and Logging: Implement and maintain monitoring, logging, and alerting solutions. Key technologies: AWS, VPN, VPC Peering, EC2, S3, Lambda, Aurora, Docker/Kubernetes, Apache Airflow, AWS networking concepts such as VPN, VPC peering, subnets, security groups, NAT gateways, AWS CloudWatch or equivalent, Kafka or similar data streaming
there is little work to do here. Experience in data-intensive applications is desirable here. Other technology in the stack includes Node, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. They have a hybrid-working set-up that requires the team to be in the office
or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech
Python Data Engineer (Software Engineer Programmer Developer Data Engineer Python PySpark Spark Glue Athena Iceberg Airflow Dagster DBT Java Agile AWS GCP Buy Side Asset Manager Investment Management Finance Front Office Trading Financial Services Pandas Numpy Scipy Banking) required by our asset management client in London. You MUST have … Developer/Software Engineer/Programmer Excellent Python PySpark Excellent data engineering AWS, GCP or Azure Agile The following is DESIRABLE, not essential: Iceberg Airflow or Dagster Dremio or DBT Java Finance Role: Python Data Engineer (Software Engineer Programmer Developer Data Engineer Python PySpark Spark Glue Athena Iceberg Airflow … will bring Python and PySpark experience to contribute towards this initiative. They will also be looking at the use of tooling such as Iceberg, Airflow, Dagster, Dremio, DBT, Glue and Athena. These are not essential, only 'nice-to-have' technologies. This is also an excellent opportunity to enter into
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, Azure ML is a plus Experience with
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure
pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache Airflow). Good understanding of data modelling, algorithms, and data transformation techniques to work with data platforms. Working knowledge of cloud development practices
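The orchestration idea these tools share — run dependent pipeline steps (ingest, transform, serve) in dependency order — can be sketched in plain Python. This is a minimal illustration of the concept, not Airflow or Step Functions API code; the task names and `run_pipeline` helper are hypothetical.

```python
# Hypothetical sketch of the core idea behind DAG orchestrators such as
# Apache Airflow: tasks declare upstream dependencies, and the scheduler
# runs them in topological order. Requires Python 3.9+ for graphlib.
from graphlib import TopologicalSorter


def run_pipeline(tasks, deps):
    """Run each task callable in an order that respects its dependencies.

    tasks: mapping of task name -> zero-argument callable
    deps:  mapping of task name -> set of upstream task names
    """
    executed = []
    # static_order() yields nodes so every task's predecessors come first.
    for name in TopologicalSorter(deps).static_order():
        tasks[name]()
        executed.append(name)
    return executed


# Usage: a three-step ingest -> transform -> serve pipeline.
order = run_pipeline(
    {"ingest": lambda: None, "transform": lambda: None, "serve": lambda: None},
    {"ingest": set(), "transform": {"ingest"}, "serve": {"transform"}},
)
print(order)  # → ['ingest', 'transform', 'serve']
```

Real orchestrators add scheduling, retries, and distributed execution on top of this ordering, but the dependency graph is the common core.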
Azure SQL Data Warehouse, or Amazon Redshift Support and learn the associated scheduling logic for data pipelines using scheduling tool such as Autosys or Airflow Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of timing of data loads
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java or C#. Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive. Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call