to deliver best-in-class data science products and solutions for end clients. Exposure to cloud-based analytical platforms such as Databricks, Snowflake, Google BigQuery, etc.
Demonstrable commercial experience working on data pipelines, data warehouses, and ETL. Strong Python coding skills. GCP experience (GCP Datastream, Pub/Sub, Dataflow, Dataform, BigQuery). Experience with modern development methods and tooling: containers (e.g., Docker), container orchestration (Kubernetes/K8s), CI/CD, version control (Git, GitHub, GitLab).
Greater London, England, United Kingdom Hybrid / WFH Options
BJSS
of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion, etc. About You: You're an engineer at heart and enjoy the challenge of building reliable, efficient data application systems.
working on: We’re looking for a product analyst to join our ever-developing London office. You’ll be using tools and databases including BigQuery, Tableau, dbt and Python, to tackle important problems that we’re only just starting to understand, in order to: Measure, quantify and optimise complex …
Must have experience with building a visualisation layer on top of the warehouse, e.g. Power BI. Must have worked with cloud database technology like Google BigQuery, Snowflake, etc. Should have experience with multi-dimensional OLAP cubes to store/reference data in a semantic data layer. Asset Management data expertise. Strong …
in Python and SQL. Data Engineer Desirable Qualities: Ability to communicate effectively with technical and non-technical people. Nice-to-haves include: Kubernetes, Docker, BigQuery, Cloud, Databases. Interest in FinTech. Data Engineer Responsibilities: Creating new data pipelines. Implementing automated data pipelines and overseeing MLOps processes to streamline the workflow.
in a really good place, as they have worked with Data Engineers to do so. Required experience: Experience in data analytics (they use Google BigQuery and Google Analytics 4). They would also like some familiarity with dbt. SQL. Familiarity with common e-commerce testing and measurement strategies such as geo …
crucial role in developing and maintaining our data infrastructure, ensuring scalability, reliability, and efficiency. Key Responsibilities: Design, develop, and maintain scalable data pipelines using BigQuery, Python, and Cloud Run. Collaborate with data scientists and analysts to understand business requirements and translate them into technical solutions. Implement best practices for data … date with the latest technologies and trends in data engineering and recommend innovative solutions to enhance our data infrastructure. Required Skills: Strong proficiency in BigQuery, Python, and Cloud Run. Experience with building and optimizing data pipelines for large-scale datasets. Solid understanding of data modeling concepts and ETL processes. Experience …
maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. Our business …
maintain scalable data architectures, including pipelines and cloud-based data warehouses. Tech: Python (NumPy, Pandas), SQL, ETL, Cloud (AWS, Azure or GCP), Snowflake, Airflow, BigQuery, Power BI/Tableau. Industry: Fintech, Maritime trading. Immersum are supporting the growth of a specialist consultancy who solely specialise in the Maritime trading industry.
optionally) experience with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Excellent communication skills and experience …
Python skills. • Extensive use of cloud technologies such as AWS and GCP. • Good working knowledge of data warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake). • Experience in deploying and scheduling code bases in a data development environment, using technologies such as Airflow. • Demonstrable experience of working …
product teams on existing projects and new innovations to support company growth and profitability. OUR TECH STACK: · Python · Scala · Kotlin · Spark · Google Pub/Sub · Elasticsearch, BigQuery, PostgreSQL · Kubernetes, Docker, Airflow (FullCircl, Lead Data Engineer, 04.24) KEY RESPONSIBILITIES: · Designing and implementing scalable data pipelines using tools such as Apache Spark …
Python and experience with relevant libraries (e.g., pandas, numpy). Extensive experience with ETL tools and processes. Familiarity with data warehousing concepts and technologies (e.g., Redshift, BigQuery, Snowflake). Proficient in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Exposure to cloud platforms (e.g., AWS, GCP, Azure) and their data services …
experience with relevant libraries (e.g., pandas, numpy). • Extensive experience with ETL tools and processes. • Familiarity with data warehousing concepts and technologies (e.g., Redshift, BigQuery, Snowflake). • Proficient in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). • Exposure to cloud platforms (e.g., AWS, GCP, Azure) and their …
of relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery). Excellent scripting skills (e.g., Python, SQL). Strong analytical and problem-solving skills. Excellent communication and interpersonal skills, with the ability to interact …
to production. Generate actionable insights for business improvements. Key Skills: 3+ years of Python experience. Highly statistical and analytical. Exposure to Google Cloud Platform (BigQuery, GCS, Datalab, Dataproc, Cloud ML) (desirable). Spark & Hadoop experience. Strong communication skills. Good problem-solving skills. Qualifications: Bachelor's degree or equivalent experience in …
in Infrastructure as Code developments and tools, e.g. Terraform. Desired: experience with MLOps deployment and maintenance. Desired: Data Engineering technologies, e.g. ETL, Spark, Dataflow, BigQuery. Please note: even if you don't have exactly the background indicated, do contact us now if this type of job is of interest …
a plus). Because data preparation will be c. 25% of the role, the candidate requires a technical background and comfort with this activity. Experience: GCP BigQuery and Oracle PL/SQL experience is a plus. Cooperative and positive attitude in a group setting to achieve common goals. Ability to act …
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
This is an exciting opportunity to join one of Europe's fastest-growing health-tech startups, working in a Data team that has made huge steps in moving our innovative platform forwards. This role is Hybrid but only requires 1 …
London, England, United Kingdom Hybrid / WFH Options
Jellyfish
your clients' needs, developing long-lasting relationships and helping them navigate and solve technical challenges in the AdTech ecosystem. Familiarity with JavaScript, Google Cloud, BigQuery, SQL, Looker Studio or Adobe Analytics is advantageous, but not mandatory. When faced with a problem, you collaborate with the appropriate people to evaluate …
strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, BigQuery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial Risk, and …
ROLE SUMMARY DATA ENGINEER REMOTE/TRAVEL TO CLIENT SITE IN LONDON Valcon UK are looking to recruit self-motivated, highly logical and intellectually curious Data Engineers to join our expanding Data capability in the UK. As a …