Grand Prairie, Texas, United States Hybrid / WFH Options
Jobot
…and Fabric, ensuring their effective use in large-scale data solutions. Manage and optimize data pipelines using tools such as Apache Airflow, Prefect, dbt, and SSIS, ensuring that all stages of the pipeline (ETL) are efficient, scalable, and reliable. Ensure robust testing, monitoring, and validation of all data…
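The orchestration duties above translate naturally into an Airflow DAG with distinct extract, transform, and load stages. Below is a minimal sketch using the Airflow 2.x TaskFlow API; the DAG name, schedule, and sample data are hypothetical, not taken from the posting.

```python
# Minimal ETL DAG sketch (Airflow 2.x TaskFlow API). All names and the
# sample data are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # A real pipeline would pull from an API, S3, or a source database.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Basic validation: drop rows with non-positive amounts.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # A real load step would write to the warehouse (Redshift, Snowflake, ...).
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


example_etl()
```

Keeping each stage as its own task is what makes the testing and monitoring the posting asks for practical: failures surface per stage rather than inside one opaque script.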
…typical day will look like this: Connect with the team at standup to catch up on the latest. Build data pipelines with Spark or dbt on Starburst. Use SQL to transform data into meaningful insights. Build and deploy infrastructure with Terraform. Implement DDL, DML with Iceberg. Do a code review…
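Since the day-in-the-life above calls out Iceberg DDL and DML from Spark, here is a short sketch of what that can look like. It assumes a Spark session with an Iceberg catalog configured (called demo here) and the Iceberg Spark runtime jar on the classpath; all catalog, database, table, and bucket names are made up.

```python
# Iceberg DDL/DML from PySpark: illustrative only. Catalog wiring is
# deployment-specific; the "demo" catalog and the S3 path are placeholders.
from datetime import datetime

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-sketch")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

# DDL: create an Iceberg table.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.sales.orders (
        order_id BIGINT,
        amount   DOUBLE,
        ts       TIMESTAMP
    ) USING iceberg
""")

# Stage some updates as a temp view, then apply them idempotently.
updates = spark.createDataFrame(
    [(1, 99.5, datetime(2024, 1, 1))], ["order_id", "amount", "ts"]
)
updates.createOrReplaceTempView("updates")

# DML: MERGE is supported on Iceberg tables in Spark SQL.
spark.sql("""
    MERGE INTO demo.sales.orders t
    USING updates u
    ON t.order_id = u.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```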
…Stack: Python and Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; dbt; AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake; Elasticsearch and DynamoDB. More information: Enjoy fantastic perks like private healthcare & dental insurance, a…
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Yelp USA
…technologies: Python, AWS Redshift, AWS Athena/Apache Presto, Big Data technologies (e.g. S3, Hadoop, Hive, Spark, Flink, Kafka, etc.), NoSQL systems like Cassandra; dbt is nice to have. What you'll get: Full responsibility for projects from day one, a collaborative team, and a dynamic work environment. Competitive salary…
…to other engineering teams in a Flask API. Our current technology stack primarily includes Linux, Docker, Kubernetes, Jenkins, Kafka, Python, Flask, Postgres, AWS Redshift, dbt, Google BigQuery, Prometheus, Grafana, Elasticsearch, Kibana, Sisense. Your responsibilities: As a member of the data team, your responsibilities will include contributions to: Developing and…
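One concrete shape the Flask-facing piece of that stack can take is a thin read-only endpoint over the warehouse. A minimal sketch follows, with a stubbed query standing in for a real Redshift/BigQuery client; the route and the metric are invented for illustration.

```python
# Thin Flask API serving warehouse-derived data to other teams.
# The endpoint, metric, and stubbed query are illustrative placeholders.
from flask import Flask, jsonify

app = Flask(__name__)


def fetch_daily_signups() -> list[dict]:
    # Stand-in for a real warehouse query (Redshift, BigQuery, ...).
    return [{"day": "2024-01-01", "signups": 120}]


@app.route("/metrics/daily-signups")
def daily_signups():
    return jsonify(fetch_daily_signups())


if __name__ == "__main__":
    app.run(port=5000)
```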
…Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques; dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g. Git, Jenkins, Docker). Understanding of data governance, security best…
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
…with Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and scripting…
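For the event-driven, real-time side mentioned above, a pipeline typically starts with a consumer loop on a message broker. A minimal sketch using the confluent-kafka client; the broker address, topic, and group id are hypothetical.

```python
# Skeleton of a real-time event consumer (confluent-kafka client).
# Broker address, topic, and group id are placeholders.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "pipeline-sketch",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["product.events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # A real pipeline would validate the event and land it in the
        # warehouse (e.g. a Snowflake staging table) for dbt to transform.
        print(event)
finally:
    consumer.close()
```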
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, dbt, AWS). Enterprise-scale tech firm. Up to £70,000 plus benefits - FULLY REMOTE UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly…
…R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up or scale-up environment. Experience working in the fields of financial technology, traditional financial…
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
…as well as 30 engineers in the business's data arm. Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; dbt expertise; Terraform experience. Nice to have: data modelling; Data Vault; Apache Airflow. Benefits: up to 10% bonus; up to 14% pension contribution; 29 days annual…
…(e.g., Snowflake). Proficiency in SQL and at least one scripting language (e.g., Python, Bash). Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data quality principles and best practices. Excellent communication and collaboration skills. Working with AWS, Twilio Segment and Google Analytics. Hands-on…
…either AWS or GCP. Experience with Terraform to define and manage cloud infrastructure through code. Desirables: Experience in SQL-based transformation workflows, particularly using dbt in BigQuery. Experience with containerisation technologies (Docker, Kubernetes). Familiarity with streaming data ingestion technologies (Kafka, Debezium). Exposure to data management and Linux administration…
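Where "SQL-based transformation workflows using dbt in BigQuery" shows up in practice, an orchestration task often just shells out to the dbt CLI. A minimal sketch; the project directory, profile target, and model selector are hypothetical.

```python
# Triggering a dbt run against BigQuery from Python, e.g. inside an
# orchestration task. Paths, target name, and selector are placeholders.
import subprocess

result = subprocess.run(
    [
        "dbt", "run",
        "--project-dir", "analytics/",
        "--target", "bigquery_prod",  # a target defined in profiles.yml
        "--select", "staging+",       # staging models and everything downstream
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
result.check_returncode()  # fail loudly if any model errored
```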
…while collaborating with technical and non-technical stakeholders. What you'll do: Crafting robust Business Intelligence (BI) solutions using cutting-edge tools like Alteryx, dbt, Snowflake, Azure, Tableau, Power BI, and ThoughtSpot. Collaborating with stakeholders to gather business requirements, designing innovative solutions, and creating intuitive reports and dashboards. You'll…
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
…efficient code and comfortable undertaking system optimisation and performance tuning tasks. Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL. Has exposure to dbt and data quality test frameworks. Has awareness of Infrastructure as Code such as Terraform and Ansible. BENEFITS: Competitive salary. Company laptop supplied. Bonus scheme. …
…working autonomously. It would be a real bonus, but not a requirement, if: You've worked in a start-up environment. You've got dbt experience. You have familiarity with MLOps principles and practices and their application in a production setting. Interview Process: You'll have a 20-minute conversation…
…Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform or similar tools for deployment and infrastructure as code. In this role, you will be responsible for: Shipping and…
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
…Excellent communication skills and ability to work across regions. Desirable Experience: • Experience with time-series data and cost intelligence platforms • Familiarity with Snowflake, Airflow, dbt, or other data platform technologies • Background in data-centric product development or FinOps tooling • Exposure to ML-powered querying, analytics, or optimization • Experience in scaling…
…analysis. Knowledge of dashboard design and data visualization best practices. Experience with cloud-based data infrastructure (AWS). Familiarity with modern data stack tools (Airflow, dbt, etc.). Why This Role Matters: Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future…
…and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with new tech to…
…Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s), and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement, and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for the delivery of…
…about our Minimum Standards of Product Development here. The Tech: SQL & Python are your native languages, with a dash of Scala when needed. dbt, data modeling, and analytics are your go-to tools; Airflow is your daily companion. BigQuery/GCP hold no secrets for you, and AWS is…
W1S, St James's, Greater London, United Kingdom Hybrid / WFH Options
MFK Recruitment
…Monitoring tools (Grafana, Prometheus, etc.). Mentoring and knowledge sharing within the team. Senior Engineer - Desirable Skills: Experience in energy supply or trading. Familiarity with dbt or modular analytics tooling. Exposure to forecasting or optimisation workflows. Knowledge of React or frontend tools for internal apps. What they offer: A high-autonomy…
…Experience with at least one BI/visualisation tool (e.g. Looker/Power BI/Tableau, etc.). Knowledge of ETL/ELT tools (e.g. dbt/Fivetran, etc.). An understanding of Data Governance principles, and the importance of maintaining data quality and providing accurate data and sound insights. An agile…