Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by … led small teams on the delivery of projects AWS Glue Dremio Agile The following is DESIRABLE, not essential: Snowflake, Spark, Airflow, Apache Iceberg, Arrow, DBT Trading, Front Office finance Some appreciation of asset classes such as Fixed Income, equities, FX or commodities Role: Python Software Engineer Team Lead (Architecture Programmer …) required by my more »
good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income JavaScript Node Fixed … times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are more »
help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines coupled with onboarding new, disparate data sets, sourced from more »
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
Delta Lake, and other enterprise scale data stores. Data Orchestration - Enterprise scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - GIT/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure as code and other DevOps practices. more »
source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams more »
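The listing above names event-driven architectures and streaming technologies such as Kafka. As an illustration only (this is not Kafka's API, and the `EventBus` class, topic name and event payload are all hypothetical), the core publish/subscribe pattern such roles assume can be sketched in plain Python:

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Toy in-process pub/sub: handlers subscribe to a topic,
    and publish() delivers each event to every subscriber in order."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

# Hypothetical usage: a consumer collecting trade events from a "trades" topic.
bus = EventBus()
received: list[dict] = []
bus.subscribe("trades", received.append)
bus.publish("trades", {"symbol": "EURUSD", "qty": 1_000_000})
print(received)  # [{'symbol': 'EURUSD', 'qty': 1000000}]
```

Real streaming platforms add durable logs, partitioning and consumer groups on top of this basic producer/consumer shape.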
Experience with building and optimizing data pipelines for large-scale datasets. Solid understanding of data modeling concepts and ETL processes. Experience with DBT (Data Build Tool). Desirable Skills: Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an more »
experience: Ability to operate in a fast-changing environment. Fluent in English. Previous cloud-based infrastructure experience, particularly with AWS. Experience using Airflow and dbt. Expert SQL knowledge. Solid understanding of Dimensional Data Modelling. Experience with at least one or more of these programming languages: Python, Scala/Java Experience more »
s largest clients Develop solutions to parse and process tabular data from PDF and HTML documents Maintain, support and expand existing data pipelines using DBT, Snowflake and S3 Implement standardised data ingress/egress pipelines Onboard new, disparate data sets, sourced from many and varied data vendors, covering all asset more »
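The listing above asks for solutions that parse tabular data from HTML documents. A minimal sketch of that task, using only Python's standard-library `html.parser` (the `TableParser` class and the sample table are illustrative, not part of any listing's actual codebase):

```python
from html.parser import HTMLParser

class TableParser(HTMLParser):
    """Collects the text of each <td>/<th> cell, grouped row by row."""

    def __init__(self) -> None:
        super().__init__()
        self.rows: list[list[str]] = []
        self._in_cell = False

    def handle_starttag(self, tag: str, attrs: list) -> None:
        if tag == "tr":
            self.rows.append([])          # start a new row
        elif tag in ("td", "th"):
            self._in_cell = True
            self.rows[-1].append("")      # start a new empty cell

    def handle_endtag(self, tag: str) -> None:
        if tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data: str) -> None:
        if self._in_cell:
            self.rows[-1][-1] += data.strip()

# Hypothetical sample: a two-row table from a vendor document.
doc = "<table><tr><th>Asset</th><th>Price</th></tr><tr><td>Gilt</td><td>98.2</td></tr></table>"
parser = TableParser()
parser.feed(doc)
print(parser.rows)  # [['Asset', 'Price'], ['Gilt', '98.2']]
```

PDF extraction would need a third-party library, but the downstream shape (a header row plus data rows) is the same.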
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders Experience of using tools including Snowflake, DBT, ADF and Azure Synapse Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information Legal more »
professionals. Acting as a leader and mentor to team members, fostering their professional development. Working closely with technology partners such as Google Cloud (GCP), dbt Labs, and Looker. Playing a crucial role in shaping the architecture team, driving innovation and maintaining high standards of performance. Requirements of the Cloud Architect more »
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2 + more »
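The extract/transform/load shape the listing above describes can be sketched in plain Python. This is a toy stand-in, not Airflow or DBT code: the in-memory CSV string stands in for an S3 object or vendor feed, and the `warehouse` list stands in for a real warehouse write.

```python
import csv
import io

# Extract: read raw CSV text (here an in-memory sample; a real pipeline
# would pull this from S3 or a vendor API).
raw = "symbol,price\nEURUSD,1.0842\nGBPUSD,1.2710\n"

def extract(text: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    # Cast prices to float; real pipelines add validation and deduplication here.
    return [{"symbol": r["symbol"], "price": float(r["price"])} for r in rows]

def load(rows: list[dict], sink: list) -> None:
    # Stand-in for a warehouse load step (e.g. a Snowflake COPY).
    sink.extend(rows)

warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
print(len(warehouse))  # 2
```

An orchestrator such as Airflow would schedule each of these steps as a task and manage the dependencies between them.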
value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call more »
Previous experience in a Data Engineering role Strong SQL and Python development skills Hands-on experience with cloud-based data warehousing technologies (e.g., Snowflake, DBT, Fivetran, Airflow) Effective communication skills for both technical and non-technical audiences Analytical mindset with attention to detail High energy, enthusiasm, and passion for learning more »
Python/JavaScript/C# Familiarity with statistical/machine learning/AI concepts and techniques Understanding of data pipeline/orchestration tools e.g. dbt, Dataform Appreciation of GCP’s serverless technologies e.g. Cloud Run/Workflows Understanding of Google’s marketing stack, Google Analytics, Google Tag Manager, Google Ads more »
M50, Trafford Park, Trafford, Greater Manchester, United Kingdom
Hoist Finance
SQL Server, Oracle, MySQL, PostgreSQL) Knowledge of BI Stack design and implementation Any knowledge in some of the following areas is an advantage: Snowflake, DBT, Azure Technologies including Azure Data Factory, Azure Data Lake, Azure DevOps, PowerShell, Git, Python. Excellent communication and interpersonal skills As a Data Engineer you will more »
Employment Type: Permanent
Salary: £40000 - £45000/annum + Car Allowance + Bonus
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP more »
value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com, and let more »
years of demonstrated commercial experience as a Data Engineer or similar role within large-scale environments dealing with large data sets. Expertise in SQL & dbt and ideally Kafka. Significant Python coding skills. Containerisation experience (Docker, Kubernetes). Cloud Computing experience (GCP/AWS/Azure). Strong preference for a Snowflake background more »
business requirements spanning a number of systems At least 10 years of relevant experience Hands-on in-depth experience in the following: Snowflake/DBT/Airflow Background/working experience in the following: Azure Power BI/DAX Traditional SQL (SQL Server, MySQL, Postgres) JIRA, Confluence, (GitHub/Bitbucket) Azure more »
hands-on software engineering using a broad range of technologies including the following: Java or Python Microservices Data pipelines and database programming such as DBT, SQL, BigQuery, Cloud Composer etc CI/CD/DevOps tooling experience e.g., GIT, Jenkins etc What you'll get to learn (any previous experience more »