Westminster, Colorado, United States Hybrid / WFH Options
Maxar Technologies
with Git. Preferred qualifications: Prior experience with CI/CD technologies such as Jenkins. Prior experience with any of the following: Trino/Starburst, dbt (Core or Cloud), Apache Superset, OpenMetadata, Apache Airflow, Tableau. Prior experience with RDS databases or Postgres. Agile software development lifecycle experience. These skills would be more »
Has used Mixpanel. We're looking for someone who: understands data warehousing principles, for example using dbt; has led the delivery of complex projects with a focus on scalable, reliable, and accessible solutions; has excellent communication skills and presents data in an accessible way for non-technical people as more »
Analytics Engineer €40,000-€50,000 dbt, SQL, Snowflake Porto, twice a week in office FinTech influencing global markets We are partnered exclusively with a technology company transforming payments, whose global reach spans 120 markets to deliver a better experience for guests, shoppers, and consumers everywhere. Backed … You will actively contribute to the development and optimization of data models and ELT processes within a data mesh architecture, using tools such as dbt and Snowflake. You will work closely within the Analytics Engineering team to implement data solutions that adhere to best practices, standards, and coding conventions, ensuring … commercial experience with strong proficiency in SQL and experience using platforms such as Snowflake and BigQuery. Experience with data modelling and ELT processes using dbt in a data mesh environment is highly desirable. Experience with Python scripting to automate processes. A proven ability to work collaboratively in a team environment more »
Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg Arrow dbt gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by … led small teams on the delivery of projects AWS Glue Dremio Agile The following is DESIRABLE, not essential: Snowflake Spark, Airflow, Apache Iceberg, Arrow, dbt Trading, Front Office finance Some appreciation of asset classes such as Fixed Income, equities, FX or commodities Role: Python Software Engineer Team Lead (Architecture Programmer … Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow dbt gRPC protobuf TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP more »
source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams more »
professionals. Acting as a leader and mentor to team members, fostering their professional development. Working closely with technology partners such as Google Cloud (GCP), dbt Labs, and Looker. Playing a crucial role in shaping the architecture team, driving innovation and maintaining high standards of performance. Requirements of the Cloud Architect more »
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and dbt. Must haves: A team player, happy to work with several teams; this is key, as you will report directly to the CTO. 2 + more »
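Several of these listings ask for hands-on experience designing ETL pipelines in Python. As a rough, illustrative sketch of the extract-transform-load pattern they describe — using only the standard library's sqlite3 as a stand-in for a real warehouse, with invented table and column names, not the actual stack of any employer above:

```python
# Minimal ETL sketch. Illustrative only: real roles above use Airflow + dbt
# against cloud warehouses; sqlite3 stands in here so the example is runnable.
import csv
import io
import sqlite3

# Hypothetical raw source data (in practice: S3 files, APIs, vendor feeds)
RAW_CSV = """order_id,amount_pence,currency
1,1999,GBP
2,250,GBP
3,10000,GBP
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and convert pence to pounds."""
    return [(int(r["order_id"]), int(r["amount_pence"]) / 100, r["currency"])
            for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write transformed rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE orders (order_id INT, amount_gbp REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

def run_pipeline() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    return conn

if __name__ == "__main__":
    conn = run_pipeline()
    total = conn.execute("SELECT SUM(amount_gbp) FROM orders").fetchone()[0]
    print(round(total, 2))  # 122.49
```

In an Airflow deployment, each of the three stages would typically become its own task so failures can be retried independently.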
the firm’s largest clients. Develop solutions to parse and process tabular data from PDF and HTML documents. Maintain, support and expand existing data pipelines using dbt, Snowflake and S3. Implement standardised data ingress/egress pipelines. Onboard new, disparate data sets, sourced from many and varied data vendors, covering all asset types and more »
years of demonstrated commercial experience as a Data Engineer or similar role within large-scale environments dealing with large data sets. Expertise in SQL & dbt and ideally Kafka. Significant Python coding skills. Containerisation experience (Docker, Kubernetes). Cloud computing experience (GCP/AWS/Azure). Strong preference for a Snowflake background more »
capabilities. Expert data engineering and database practitioner; experience of designing and building AWS-based data solutions including pipelines and a data warehouse/lake. Experience of dbt or an equivalent data modelling tool. Experience of administering SQL databases and highly proficient in SQL query optimisation. Broad experience of data analytics and reporting platforms more »
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, dbt and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, dbt, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the link and more »
the ability to inspire and mentor engineers, teaching them best practices and fostering a culture of continuous improvement within an agile framework. Experience with dbt and Snowflake or other cloud based data warehouses. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams. Detail-oriented more »
City Of London, England, United Kingdom Hybrid / WFH Options
Harnham
Gaming or Entertainment experience. Experience managing, mentoring, or coaching a small team of analytics engineers. Advanced knowledge and commercial experience with tools such as dbt, Redshift, and other AWS tools. Good educational background is preferred. Strong communication skills. A passion for gaming! THE BENEFITS A salary of up to more »
Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000 Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing *You must be based in London, and have full permanent right to work in the UK to apply for this role* I'm currently working with a leading media agency, specialised … above ASAP and I will be in touch. more »
Join a team at the heart of the global economy! The Department for Business and Trade ("DBT") and Inspire People are partnering together to bring you an exciting opportunity for Lead Improvement Architects/Managers to create and deliver a Technical Services Improvement Plan which will sit at the … days with service, three paid volunteering days a year, and an employee benefits programme including Cycle to Work. Further information: This role requires SC clearance. DBT’s requirement for SC clearance is to have been present in the UK for at least 3 of the last 5 years. Failure to meet more »
West Bend, Wisconsin, United States Hybrid / WFH Options
Delta Defense
relevant work experience and certifications. Strong understanding of best practices related to PII (Personally Identifiable Information) and be able to apply them. 2+ years dbt experience required. 5+ years of experience in data engineering required. Prior experience executing within cloud data warehouses (Snowflake, Redshift, BigQuery) required. 2+ years of experience more »
Python/JavaScript/C# Familiarity with statistical/machine learning/AI concepts and techniques Understanding of data pipeline/orchestration tools e.g. dbt, Dataform Appreciation of GCP’s serverless technologies e.g. Cloud Run/Workflows Understanding of Google’s marketing stack, Google Analytics, Google Tag Manager, Google Ads more »
that there should be learning in every role you do. However, some experience in the following is important for this position: Advanced SQL and dbt skills to clean, transform and validate data from Data warehouses or Data Lakes Experience and knowledge in data warehousing and data modelling best practices Experience more »
to support decision-making efforts. Communicate technical information with both technical and non-technical team members and collaborators. Data Transformation: employ ETL tools like dbt to transform data, making it more accessible to the broader business. Use time series graphing services such as Grafana to build visualisations, monitor trends, and more »
Data Modelling. The primary focus of this role will be the development of a Snowflake data warehouse environment and will include using ETL tools, dbt and Power BI for reporting and data visualisation. As the ideal candidate you will be adept at working with large data sets to develop advanced more »