London, South East England, United Kingdom Hybrid / WFH Options
Noir
pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools …
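For a concrete sense of the Cloud Composer (Apache Airflow) plus BigQuery work this role describes, here is a minimal DAG sketch that loads daily CSV exports from Cloud Storage into BigQuery. It is an illustration only, assuming Airflow's Google provider package is installed; the bucket, dataset, and table names are hypothetical.

```python
# Minimal sketch: daily GCS-to-BigQuery load on Cloud Composer (Airflow).
# Bucket, dataset, and table names below are placeholders, not real resources.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",              # hypothetical bucket
        source_objects=["events/{{ ds }}/*.csv"],     # files keyed by execution date
        destination_project_dataset_table="analytics.events_raw",  # hypothetical table
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```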
Washington, Washington DC, United States Hybrid / WFH Options
SMX
and NoSQL databases. Proficiency in writing complex queries and applying database optimization techniques. Data Warehousing: Experience with data warehousing solutions like Amazon Redshift, Google BigQuery, or Microsoft Azure SQL Data Warehouse. Soft Skills: Strong communication and collaboration skills. Excellent problem-solving skills. US Citizenship is required to obtain a …
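One common form the "database optimization techniques" above take in BigQuery is partition pruning: filtering on the table's partition column so only the relevant partitions are scanned. A hedged sketch using the official Python client, with an invented project, dataset, and table:

```python
# Sketch of partition pruning with the google-cloud-bigquery client.
# The table `example-project.sales.orders` is hypothetical and assumed to be
# partitioned on order_date; the date filter limits the bytes scanned.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT user_id, COUNT(*) AS orders
    FROM `example-project.sales.orders`
    WHERE order_date BETWEEN @start AND @end  -- prunes to matching partitions
    GROUP BY user_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
    ]
)
for row in client.query(sql, job_config=job_config).result():
    print(row.user_id, row.orders)
```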
technology approach combining talent with software and service expertise. Tasks & Responsibilities: Design, develop, and maintain scalable data pipelines on GCP using services such as BigQuery and Cloud Functions. Collaborate with internal consulting and client teams to understand data requirements and deliver data solutions that meet business needs. Implement data …
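One plausible shape for the "BigQuery and Cloud Functions" pipelines mentioned above is a Pub/Sub-triggered function that streams events into a table. A minimal sketch, assuming the functions-framework package and a hypothetical destination table:

```python
# Sketch: Pub/Sub-triggered Cloud Function appending JSON events to BigQuery.
# TABLE is a placeholder; the event payload format is an assumption.
import base64
import json

import functions_framework
from google.cloud import bigquery

client = bigquery.Client()
TABLE = "example-project.analytics.events_raw"  # hypothetical table


@functions_framework.cloud_event
def ingest_event(cloud_event):
    """Decode a Pub/Sub message and stream it into BigQuery."""
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    row = json.loads(payload)
    errors = client.insert_rows_json(TABLE, [row])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```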
designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one …
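For the Spark/Kafka pairing above, a typical hands-on task is a structured-streaming job that consumes a topic and lands the data for downstream processing. A minimal sketch with placeholder broker, topic, and paths (the spark-sql-kafka connector must be on the classpath):

```python
# Sketch: PySpark structured streaming from Kafka to Parquet files.
# Broker address, topic name, and filesystem paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/events")              # hypothetical sink path
    .option("checkpointLocation", "/chk/events")  # required for streaming sinks
    .start()
)
query.awaitTermination()
```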
London, South East England, United Kingdom Hybrid / WFH Options
Careerwise
for data analysis, machine learning, and data visualization. In-depth knowledge of cloud platforms (AWS, Azure, Google Cloud) and related data services (e.g., S3, BigQuery, Redshift, Data Lakes). Expertise in SQL for querying large datasets and optimizing performance. Experience working with big data technologies such as Hadoop, Apache …
Washington, Washington DC, United States Hybrid / WFH Options
Digital Management, Inc
in the Task Order • Excellent communication skills. Preferred Skills: • Experience with cloud-based Data Warehouse platforms and solutions (e.g., Oracle FDIP, Snowflake, Amazon Redshift, Google BigQuery). • Relevant certifications in data management or ETL tools are a plus. Experience with one or more of the following systems: Maximo, PeopleSoft FSCM …
data from diverse sources. Strong knowledge of SQL/NoSQL databases and cloud data warehouse technology such as Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform …
ELT workflows. Strong analytic skills related to working with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g. Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as …
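As a hedged illustration of the Kafka piece of the toolchain above, here is a small producer using the kafka-python library; the broker address, topic name, and record shape are assumptions:

```python
# Sketch: publish JSON records to a Kafka topic with kafka-python.
# Broker and topic are placeholders; the record schema is invented.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",  # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("user-events", {"user_id": 42, "action": "signup"})
producer.flush()  # block until buffered records are delivered
```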
databases. Hands-on experience working with visualization tools including ThoughtSpot, Power BI or Tableau. Familiarity with leading cloud-based data warehouses such as Azure, BigQuery, AWS Redshift, or Snowflake. Strong analytical and problem-solving abilities to address complex data challenges. Detail-oriented mindset with a focus on data accuracy …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
commercial environment creating production-grade ETL and ELT pipelines in Python. Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery. Hands-on experience with data orchestrators such as Airflow. Knowledge of Agile development methodologies. Awareness of cloud technology, particularly AWS. Knowledge of automated delivery …
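To make the ETL-vs-ELT distinction above concrete: in ELT the raw data is loaded into the warehouse first and transformed there with SQL. A minimal sketch against Redshift via psycopg2; the cluster endpoint, credentials, S3 path, IAM role, and table names are all placeholders:

```python
# Sketch of an ELT step on Redshift: bulk-load raw files, then transform in SQL.
# All connection details and object names below are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.redshift.amazonaws.com",  # hypothetical cluster
    dbname="analytics",
    user="etl_user",
    password="...",  # supplied via secrets management in practice
)

with conn, conn.cursor() as cur:
    # L: copy raw files straight into a landing table
    cur.execute("""
        COPY raw.orders
        FROM 's3://example-bucket/orders/2024-01-01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        FORMAT AS CSV IGNOREHEADER 1;
    """)
    # T: transform inside the warehouse with plain SQL
    cur.execute("""
        INSERT INTO marts.daily_revenue
        SELECT order_date, SUM(amount) FROM raw.orders GROUP BY order_date;
    """)
```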
through sharing knowledge and mentoring. Minimum Requirements: Min. 3 years of experience as a Data Analyst/Scientist. Proficiency in a query language/framework (SQL, BigQuery, MySQL) is a MUST. Experience handling big data projects. Experienced in R or Python. Experience with data visualisation tools like Looker. Mastered various …
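A small, hedged sketch of the SQL-plus-Python analyst workflow this posting describes: pull an aggregate out of BigQuery into pandas and inspect it. The project and table names are invented.

```python
# Sketch: query BigQuery into a pandas DataFrame for quick analysis.
# `example-project.product.users` is a hypothetical table.
from google.cloud import bigquery

client = bigquery.Client()
df = (
    client.query(
        """
        SELECT DATE(created_at) AS day, COUNT(*) AS signups
        FROM `example-project.product.users`
        GROUP BY day
        ORDER BY day
        """
    )
    .result()
    .to_dataframe()
)

print(df.describe())
print(df.tail(7))  # last week's signups at a glance
```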
Be Doing Design, develop, maintain, and optimize data pipelines on Google Cloud Platform. Build and manage data warehouses and lakes using tools like BigQuery, Cloud Storage, and Dataflow. Leverage GCP services including Cloud Pub/Sub, Cloud Composer, and Dataflow to ensure seamless data flow and processing. …
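The Cloud Pub/Sub piece above usually starts with a publisher feeding the pipeline. A minimal sketch with the official client library; the project and topic IDs are placeholders:

```python
# Sketch: publish a JSON event to a Cloud Pub/Sub topic.
# Project and topic IDs are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "raw-events")

payload = json.dumps({"order_id": 123, "status": "created"}).encode("utf-8")
future = publisher.publish(topic_path, payload)
print(f"Published message {future.result()}")  # blocks until the server acks
```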
external clients. Strong hands-on experience using SQL for multi-step/complex analytics is essential for this role. Experience in cloud platforms (GCP - BigQuery, Azure - Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This …
London, South East England, United Kingdom Hybrid / WFH Options
ENI – Elizabeth Norman International
you’ll bring: Strong SQL skills. Proven experience building and maintaining data pipelines (Airflow, dbt, etc.). Familiarity with cloud data platforms like Snowflake, BigQuery, or Redshift. Solid experience with BI tools like Looker, Tableau, or similar. Understanding of data warehousing and data architecture best practices. Ability to simplify …
while mentoring and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with …
data services (S3, Athena, Glue). Experience with Airflow for scheduling and orchestrating workflows. Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery). A pragmatic problem solver who can balance technical excellence with business needs. At Funding Circle we are committed to building diverse teams, so please …
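As a hedged sketch of the S3/Athena/Glue workflow above: Athena queries data catalogued by Glue in place on S3, writing results back to a bucket. The database, query, and output location here are invented:

```python
# Sketch: run an Athena query over a Glue-catalogued data lake with boto3.
# Region, database, table, and output bucket are placeholders.
import boto3

athena = boto3.client("athena", region_name="eu-west-1")

response = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) FROM loans GROUP BY status",
    QueryExecutionContext={"Database": "lake_raw"},           # hypothetical Glue DB
    ResultConfiguration={"OutputLocation": "s3://example-results/athena/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution with this id
```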
for: Technical excellence across the Data Engineering ecosystem: We primarily work in Python, Go and SQL. Our code (tracked via GitHub) deploys to GCP (BigQuery, Airflow/Composer, GKE, Cloud Run), dbt Cloud and Azure via Terraform. We recognise this is a broad list; if you're not deeply …
Python & DS packages (numpy/pandas/sklearn/keras, etc.); ability to optimize and speed up code. Solid experience with SQL (MySQL, Google BigQuery); understanding of data warehousing and ETL processes. Know how to deploy models to production: Flask/FastAPI, Linux, Docker, GitLab CI/CD, GCP. …
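The FastAPI deployment path named above typically looks like a small prediction service wrapping a serialized model. A minimal sketch; the model artifact and feature layout are hypothetical:

```python
# Sketch: serve a pickled sklearn model behind FastAPI.
# "model.joblib" and the feature vector shape are assumptions.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical artifact baked into the image


class Features(BaseModel):
    values: list[float]  # ordered feature vector


@app.post("/predict")
def predict(features: Features) -> dict:
    prediction = model.predict([features.values])[0]
    return {"prediction": float(prediction)}
```

In the Docker/GitLab CI setup the posting mentions, this would typically be run with an ASGI server (e.g. `uvicorn app:app`) inside the container image.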
modeling, and data warehousing. Proficient in SQL and familiar with NoSQL databases. Skilled in Python, Java, or Scala for data pipeline development. Experienced with BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub. Exposure to Hadoop, Spark, Kafka. Data Engineer - GCP & Python. Location: London, UK. Type: Hybrid (3 days onsite …
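Dataflow pipelines of the kind listed above are written with Apache Beam. A minimal streaming sketch from Pub/Sub into BigQuery; the subscription and table are invented, and the destination table is assumed to already exist:

```python
# Sketch: Apache Beam streaming pipeline (runnable on Dataflow) reading
# Pub/Sub and writing to BigQuery. Subscription and table are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/events-sub"
        )
        | "Parse" >> beam.Map(json.loads)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```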
to help us build structured, high-quality data solutions for our Content Understanding teams. Our teams process petabytes of data using tools such as BigQuery and Dataflow, and where needed we develop our own data tooling to support and evolve our music catalog. What You'll Do: Take ownership …
to help us build structured, high-quality data solutions for our Content Understanding teams. Our teams process petabytes of data using tools such as BigQuery and Dataflow, and where needed we develop our own data tooling to support and evolve our music catalog. Location: London. Job type: Permanent. What …