standards to ensure alignment across teams. Experience: Experience in a similar analytics engineering role. Strong SQL skills and experience with data warehouses (e.g., Redshift, Snowflake, BigQuery). Proficiency with dbt or similar data transformation tools. Experience with BI tools (e.g., Tableau, Looker) for creating reports and dashboards. Knowledge of data governance
stakeholders alike. The Head of BI will work within a modern, cloud-based BI ecosystem, including:
Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4
Data Lake & Storage: Databricks Delta Lake, Amazon S3
Data Transformation: dbt Cloud
Data Warehouse: Snowflake
Analytics & Reporting: Power BI, Excel, Snowflake
ELT workflows. Strong analytical skills related to working with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g. Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as
Runcorn, Cheshire, North West, United Kingdom Hybrid / WFH Options
Forward Role
much more. Key Responsibilities: Develop and maintain ETL/ELT data pipelines using Python and SQL. Work with enterprise-level cloud platforms, ideally GCP (BigQuery, Airflow, Cloud Functions). Integrate APIs and process data from multiple sources. Design and optimise data warehouses and reporting systems. Build reports and dashboards
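The ETL/ELT pattern the responsibilities above describe can be sketched in plain Python. This is an illustrative example only, not taken from any listing: the records, table, and column names are invented, and sqlite3 stands in for a cloud warehouse such as BigQuery.

```python
import sqlite3

# Extract: a real pipeline would pull from an API or source system;
# these in-memory records are hypothetical stand-ins.
def extract():
    return [
        {"id": 1, "amount": "19.99", "region": "uk"},
        {"id": 2, "amount": "5.00", "region": "US"},
        {"id": 3, "amount": None, "region": "uk"},  # invalid row, dropped below
    ]

# Transform: cast types, normalise values, drop rows with missing amounts.
def transform(rows):
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue
        clean.append((row["id"], float(row["amount"]), row["region"].upper()))
    return clean

# Load: write the cleaned rows to the warehouse (sqlite3 here; in production
# this step would target BigQuery, Snowflake, etc.).
def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
n_rows = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(n_rows)  # 2
```

In practice each stage would be a separate task in an orchestrator such as Airflow, so failures can be retried per stage rather than rerunning the whole pipeline.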
databases. Hands-on experience working with visualization tools including ThoughtSpot, Power BI or Tableau. Familiarity with leading cloud-based data warehouses such as Azure, BigQuery, AWS Redshift, or Snowflake. Strong analytical and problem-solving abilities to address complex data challenges. Detail-oriented mindset with a focus on data accuracy
dbt. You have experience building dashboards in Looker and/or Tableau. Experience with AWS Redshift and/or other cloud data warehouses (Snowflake, BigQuery). Typeform drives hundreds of millions of interactions each year, enabling conversational, human-centered experiences across the globe. We move as one team, empowering
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
commercial environment creating production-grade ETL and ELT pipelines in Python. Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery. Hands-on experience with data orchestrators such as Airflow. Knowledge of Agile development methodologies. Awareness of cloud technology, particularly AWS. Knowledge of automated delivery
and data governance practices, with an emphasis on scalability and compliance in research environments. Enterprise exposure to data engineering tools and products (Spark, PySpark, BigQuery, Pub/Sub) with an understanding of product/market fit for internal stakeholders. Familiarity with cloud computing environments, including but not limited to
with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Excellent communication skills
for workflow orchestration. Strong experience in Python with demonstrable experience in developing and maintaining data pipelines and automating data workflows. Proficiency in SQL, particularly BigQuery SQL for querying and manipulating large datasets. Familiarity with machine learning (ML) concepts, algorithms (supervised/unsupervised learning), and ML tools. Experience with version
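For a flavour of the kind of SQL manipulation such roles involve, here is an illustrative grouping-and-ordering query. It is standard SQL run via sqlite3 for the sake of a self-contained example rather than BigQuery itself, and the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        (1, "signup", "2024-01-01"),
        (1, "purchase", "2024-01-03"),
        (2, "signup", "2024-01-02"),
        (2, "purchase", "2024-01-02"),
        (2, "purchase", "2024-01-05"),
    ],
)

# Events per user with first-seen date, most active users first --
# the shape of query BigQuery SQL work involves at far larger scale.
rows = conn.execute(
    """
    SELECT user_id, COUNT(*) AS n_events, MIN(ts) AS first_seen
    FROM events
    GROUP BY user_id
    ORDER BY n_events DESC
    """
).fetchall()
print(rows)  # [(2, 3, '2024-01-02'), (1, 2, '2024-01-01')]
```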
familiar with the auditing process to verify the efficacy of the data being captured. Naturally you’ll be comfortable working with SQL and ideally BigQuery (though a similar data warehousing technology is fine), with Power BI experience being a big bonus, though by no means essential. Bonus points if you
this isn't set in stone! Familiarity with digital advertising platforms (Google Ads, Facebook Ads, LinkedIn Ads, TikTok, etc.) We use Google Analytics 4, BigQuery, Fivetran, Amplitude, Metabase and dbt amongst other things for our data stack, so experience here would be beneficial; however, experience in similar tools (if
BI SQL for data querying and manipulation. Excel or Google Sheets for data analysis and reporting. Experience coding in Python or JavaScript. Experience with BigQuery or similar cloud-based data warehousing solutions. Experience with Google Marketing Platform, social media ad managers (Meta, TikTok, Snapchat, Reddit) and mobile measurement partners
as main visualization tools. 3+ years of experience in extracting & manipulating large data sets from various relational databases using SQL (Amazon Redshift, Oracle, Google BigQuery). Coding skills in at least one statistical or programming language (R or Python preferred) to import, summarise, and analyse data. Hands-on experience
through sharing knowledge and mentoring. Minimum Requirements: Min. 3 years of experience as a Data Analyst/Scientist. Proficiency in a query language/framework (SQL, BigQuery, MySQL) is a MUST. Experience handling big data projects. Experienced in R or Python. Experience with data visualisation tools like Looker. Mastered various
and mobile app development, SQL, ETL or data pipelines, and data analysis. You have experience with cloud data warehouses/lakes including Snowflake, Databricks, BigQuery, Redshift, S3, and ADLS. You have experience with AWS, GCP, and/or Azure cloud services. You have strong technical skills and experience with
technology (relational, columnar, or NoSQL) and familiarity with others (e.g., MySQL, Oracle, MSSQL, Vertica, MongoDB) Knowledge of cloud data warehouses like Snowflake, Databricks, and BigQuery Proven ability to manage technical project delivery, including scoping, planning, and risk assessment Strong organisational skills to oversee multiple concurrent customer projects Adaptability to
proven experience in managing, administering, and troubleshooting Power BI platforms, ensuring performance and security. Experience working with cloud data warehouse solutions such as Snowflake, BigQuery, Databricks, Redshift. Experience with version control systems like Git for managing Power BI products. Experience with the DevOps lifecycle. Experience mentoring visualisation developers. Knowledge
Be Doing Design, develop, maintain, and optimize data pipelines on Google Cloud Platform. Build and manage data warehouses and lakes using tools like BigQuery, Cloud Storage, and Dataflow. Leverage GCP services including Cloud Pub/Sub, Cloud Composer, and Dataflow to ensure seamless data flow and processing.
Texas office. Candidates need a Bachelor's degree in Computer Science or a related field. You'll design robust data systems using Google Cloud technologies such as BigQuery, Dataflow, and Pub/Sub. Key Responsibilities: Build and maintain scalable data pipelines using GCP tools. Ensure data security and governance. Monitor, troubleshoot, and
in their core areas:
Cloud Data Platforms: Azure Synapse Analytics, Microsoft Fabric, Azure Data Lake, Azure SQL; Amazon Redshift, AWS Athena, AWS Glue; Google BigQuery, Google Cloud Storage, Dataproc
Artificial Intelligence & Machine Learning: Azure OpenAI, Azure Machine Learning Studio, Azure AI Foundry; AWS SageMaker, Amazon Bedrock; Google Vertex AI
SQL querying experience. Hands-on experience with ETL tools is a plus. Knowledge of Data Vault, Kimball is a plus. Knowledge of Oracle, Google BigQuery or Microsoft Azure is a plus. ARE YOU IN FOR THIS CHALLENGE? WE HAVE A LOT TO OFFER: At Tripwire Solutions you will find