Southlake, Texas, United States Hybrid / WFH Options
Charles Schwab
in designing and optimizing large-scale data architectures, including the use of cloud platforms. Strong technical expertise with MS SQL Server, Google Cloud Platform, BigQuery and Snowflake. Proven track record of implementations using Informatica PowerCenter, IICS, Python, GCP Dataflow. Experience or knowledge of Continuous Integration, Continuous Delivery, and DevOps …
Lone Tree, Colorado, United States Hybrid / WFH Options
Charles Schwab
in designing and optimizing large-scale data architectures, including the use of cloud platforms. Strong technical expertise with MS SQL Server, Google Cloud Platform, BigQuery and Snowflake. Proven track record of implementations using Informatica PowerCenter, IICS, Python, GCP Dataflow. Experience or knowledge of Continuous Integration, Continuous Delivery, and DevOps …
and reporting specific to each internal department. Required Experience & Skills: Technical Expertise: Proven experience with AWS/Azure/GCP (preferably AWS), Redshift/BigQuery, Apache Spark/EMR, Docker, Python, and SQL. Hands-on expertise with workflow orchestration tools (Prefect, Mage, or Airflow). Experience with MPP architectures …
certification on Python programming concepts and 3+ years applied experience. Extensive development experience using SQL. Hands-on experience with MPP databases such as Redshift, BigQuery, or Snowflake, and modern transformation/query engines like Spark, Flink, Trino. Familiarity with workflow management tools (e.g., Airflow) and/or dbt for …
external clients. Strong hands-on experience using SQL for multi-step/complex analytics is essential for this role. Experience in cloud platforms (GCP BigQuery, Azure Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. …
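"Multi-step/complex analytics" in SQL is usually expressed by chaining common table expressions. A minimal hedged sketch, using an in-memory SQLite database as a stand-in for BigQuery/Synapse/Snowflake (the `orders` table and its columns are invented for illustration):

```python
# Multi-step SQL analytics via chained CTEs. SQLite stands in for a cloud
# warehouse; the schema and data here are purely illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'UK', 100.0), (2, 'UK', 50.0), (3, 'EU', 70.0), (4, 'EU', 30.0);
""")

# Step 1: aggregate revenue per region; step 2: rank regions by revenue.
query = """
WITH regional AS (
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
),
ranked AS (
    SELECT region, revenue,
           RANK() OVER (ORDER BY revenue DESC) AS rnk
    FROM regional
)
SELECT region, revenue, rnk FROM ranked ORDER BY rnk;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('UK', 150.0, 1), ('EU', 100.0, 2)]
```

The same CTE pattern carries over to BigQuery or Snowflake SQL with minor dialect changes (window functions require SQLite 3.25+).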
London, South East England, United Kingdom Hybrid / WFH Options
ENI – Elizabeth Norman International
you’ll bring: Strong SQL skills. Proven experience building and maintaining data pipelines (Airflow, dbt, etc.). Familiarity with cloud data platforms like Snowflake, BigQuery, or Redshift. Solid experience with BI tools like Looker, Tableau, or similar. Understanding of data warehousing and data architecture best practices. Ability to simplify …
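"Building and maintaining data pipelines" reduces to orchestrating extract, transform, and load stages. A hedged, plain-Python sketch of the shape an Airflow DAG or dbt model would orchestrate (all function names and data are hypothetical):

```python
# Minimal batch ETL pipeline sketch. In production, each stage would be an
# Airflow task or dbt model; here they are plain functions for clarity.

def extract_events():
    # Stand-in for reading from an API, object store, or source database.
    return [{"user": "a", "value": 3},
            {"user": "b", "value": -1},
            {"user": "a", "value": 5}]

def transform(events):
    # Drop invalid records and aggregate per user.
    totals = {}
    for e in events:
        if e["value"] >= 0:  # negative values treated as bad rows
            totals[e["user"]] = totals.get(e["user"], 0) + e["value"]
    return totals

def load(totals, target):
    # Stand-in for writing to Snowflake/BigQuery/Redshift.
    target.update(totals)
    return target

warehouse = {}
load(transform(extract_events()), warehouse)
print(warehouse)  # {'a': 8}
```

Keeping each stage a pure function makes the pipeline easy to unit-test before wiring it into an orchestrator.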
Required) : • Experience with data pipeline tools like Apache Airflow, dbt, or Kafka • Knowledge of cloud data services (AWS S3/Glue/Redshift, GCP BigQuery, Azure Data Factory) • Exposure to Spark, Hadoop, or other big data frameworks • Personal or academic data engineering projects Perks & Benefits : • 1:1 mentorship with …
Ruddington, Nottinghamshire, United Kingdom Hybrid / WFH Options
Experian Group
Python coding skills. Experience with AWS and knowledge of IaaS (e.g., Terraform, CDK). Proficiency in SQL for big data (Presto/HiveQL/BigQuery/SparkSQL). Familiarity with data transformation (batch & streaming) using dbt. Knowledge of shell scripting. Additional Information You will get: Personal Development - career pathway …
while mentoring and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with …
to abstract and model complex business processes into structured data systems. Hands-on experience in SQL, database design, and data modelling. Experience working with BigQuery or similar data warehouses. Familiarity with dbt (Data Build Tool) for data transformation is a strong plus. Experience building dashboards in Tableau, Looker, or …
data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance technical excellence with business needs At Funding Circle we are committed to building diverse teams, so please …
insights and automation Working knowledge of database technologies (relational, columnar, NoSQL), e.g., MySQL, Oracle, MongoDB Experience with modern cloud data warehouses (e.g., Snowflake, Databricks, BigQuery) Excellent organisational and multitasking skills across multiple sales cycles Agile and adaptable to evolving customer needs and priorities Creative problem-solver with a strategic …
with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Experience communicating with …
for: Technical excellence across the Data Engineering ecosystem: We primarily work in Python, Go and SQL. Our code (tracked via GitHub) deploys to GCP (BigQuery, Airflow/Composer, GKE, Cloud Run), dbt Cloud and Azure via Terraform. We recognise this is a broad list; if you're not deeply familiar …
POV/POCs, configuring a cloud storage bucket for a customer to upload sample data to, configuring cloud permissions to access a customer’s BigQuery instance in order to create a materialized view. What You’ll Need: Technical Expertise: Comfortable working with APIs (REST, GraphQL). Able to interpret …
/Analytics teams within the financial services and insurance sectors in the UK. Demonstrable experience in key analytics software and technologies (e.g. SQL, Python, BigQuery, Snowflake, dbt, etc.) and reporting tools (e.g. Power BI). Deep knowledge of Microsoft Excel in a commercial setting. Experience with cloud-based platforms (AWS …)
insights that matter. You’ll need: Hands-on experience with GA4, Google Analytics 360, Google Tag Manager, and CRM tools. Proficiency in SQL within a BigQuery environment. Strong dashboarding and visualisation skills using Tableau, Looker, or Google Data Studio. A keen eye for detail and a love of structured, reliable …
large-scale data ingestion, transformation, and integration projects. Key Responsibilities: - Build and maintain APIs and backend systems - Develop and optimise GCP-based data pipelines (BigQuery, Dataflow, Composer) - Design scalable data models and cloud data lakes - Integrate data from cloud and on-prem sources - Automate infrastructure with Terraform and CI …
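The "build and maintain APIs and backend systems" responsibility can be illustrated with a minimal JSON endpoint. This stdlib-only sketch stands in for a production framework (the route and payload are invented for illustration):

```python
# Minimal JSON API using only the Python standard library; a production
# backend would typically use FastAPI/Flask, but the request/response
# shape is the same idea.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a small JSON body for any GET request.
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

# Bind to an ephemeral port and serve from a background thread.
server = HTTPServer(("127.0.0.1", 0), PingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

resp = json.load(urllib.request.urlopen(f"http://127.0.0.1:{port}/ping"))
print(resp)  # {'status': 'ok'}
server.shutdown()
```

Binding to port 0 lets the OS pick a free port, which keeps the sketch runnable anywhere without configuration.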
Ansible, Chef, Puppet) and scripting languages (Python, Bash, PowerShell). Experience with cloud-native big data tools such as AWS EMR, Azure Synapse, Google BigQuery, or OCI Big Data Service. Expertise in data pipeline orchestration tools such as Apache Airflow, AWS Glue, or Azure Data Factory. Knowledge of cloud …
Python & DS packages (numpy/pandas/sklearn/keras, etc.); ability to optimize and speed up code. Solid experience with SQL (MySQL, Google BigQuery); understanding of data warehousing and ETL processes. Know how to deploy models to production: Flask/FastAPI, Linux, Docker, GitLab CI/CD, GCP. …