via automated ML Ops. Ideally, you'll also be technically skilled in most or all of the below:
- Expert knowledge of Python and SQL, including the following libraries: NumPy, Pandas, PySpark and SparkSQL
- Expert knowledge of ML Ops frameworks in the following categories …
City of London, London, United Kingdom Hybrid / WFH Options
La Fosse
… technical and non-technical teams
Troubleshoot issues and support wider team adoption of the platform
What You'll Bring:
- Proficiency in Python, PySpark, SparkSQL or Java
- Experience with cloud tools (Lambda, S3, EKS, IAM)
- Knowledge of Docker, Terraform, GitHub Actions
- Understanding of data quality frameworks …
… looking for someone who has these abilities and skills:
- Well-established Data & Analytics work experience
- Sound understanding of, and experience with, Python, Databricks, PySpark, SparkSQL and best practices
- Expertise in Star Schema data modelling
- Expertise in the design, creation and management of large datasets/data …
You'll need strong experience delivering and monitoring scalable ML/AI solutions via automated ML Ops.
- Expert knowledge of Python and SQL, including NumPy, Pandas, PySpark and SparkSQL
- Expert knowledge of ML Ops frameworks in the following categories: b) orchestration of …
… our data infrastructure and capabilities.
What is needed to succeed:
Technical skills:
- Problem-solving team player with an analytical mind
- Strong knowledge of SQL and Spark SQL
- Understanding of dimensional data modelling concepts
- Experience with Azure Synapse Analytics
- Understanding of streaming data ingestion processes
- Ability to develop and manage Apache Spark data processing applications using PySpark on Databricks
- Experience with version control (e.g., Git), DevOps, and CI/CD
- Experience with Python
- Experience with the Microsoft data platform, the Microsoft Azure stack, and Databricks
- Experience in marketing is a plus
Soft Skills:
- Strong problem-solving skills …
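The dimensional-modelling and Spark SQL skills above boil down to querying a star schema: joining a fact table to its dimensions and aggregating. A minimal sketch follows; Spark SQL is largely ANSI-compatible, so the same query shape runs on Databricks, but sqlite3 is used here only to keep the example self-contained, and all table and column names are made up for illustration.

```python
import sqlite3

# Hypothetical star schema: one fact table plus one dimension table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (sale_id INTEGER, product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
INSERT INTO fact_sales  VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 20.0);
""")

# The canonical star-schema query: join fact to dimension, aggregate by
# a dimension attribute. The same SQL works unchanged in Spark SQL.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # → [('books', 12.5), ('games', 20.0)]
```

On Databricks the tables would typically be Delta tables and the query run via `spark.sql(...)`, but the join-and-aggregate shape is identical.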
… tools to manage the platform, ensuring resilience and optimal performance are maintained.
Data Integration and Transformation
- Integrate and transform data from multiple organisational SQL databases and SaaS applications using end-to-end, dependency-based data pipelines, to establish an enterprise source of truth
- Create ETL and ELT processes using Azure Databricks, ensuring audit-ready financial data pipelines and secure data exchange with Databricks Delta Sharing and SQL Warehouse endpoints
Governance and Compliance
- Ensure compliance with information security standards in our highly regulated financial landscape by implementing Databricks Unity Catalog for governance, data quality monitoring, and ADLS … architecture
- Proven experience of ETL/ELT, including Lakehouse, Pipeline Design, Batch/Stream processing
- Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, Spark SQL
- Good working knowledge of data warehouse and data mart architectures
- Good experience in Data Governance, including Unity Catalog …
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with query languages such as SQL, PL/SQL, HiveQL, SparkSQL, or Scala
- Experience with scripting languages like Python or KornShell
- Knowledge of writing and optimizing SQL queries for large-scale, complex datasets
PREFERRED QUALIFICATIONS
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with ETL tools like Informatica, ODI, SSIS, BODI, or DataStage
We promote an inclusive culture that empowers Amazon employees to deliver the best results for our customers. If you have a disability and require workplace …
… will you bring to the table?
- 3+ years' experience as a DWH Developer or in a related role
- Thorough understanding of Kimball and Snowflake models
- SQL is your first language :-)
- You know dbt, SparkSQL, BigQuery and Git
- A natural collaborator with excellent communication skills
- Able …
… engineers to supplement the existing team during the implementation phase of a new data platform.
Main Duties and Responsibilities:
- Write clean and testable code using the PySpark and SparkSQL scripting languages, to enable our customer data products and business applications
- Build and manage data pipelines and notebooks, deploying code in a structured, trackable and …
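A common way to keep PySpark code "clean and testable", as the listing asks, is to isolate row-level logic in pure Python functions and keep the DataFrame plumbing thin, so the logic can be unit-tested without a SparkSession. A sketch of the pattern, with hypothetical function names (`clean_email`, `is_valid_record`) not taken from the listing:

```python
def clean_email(raw: str) -> str:
    """Normalise an email address: trim whitespace, lowercase."""
    return raw.strip().lower()

def is_valid_record(record: dict) -> bool:
    """Minimal validity check applied before loading a record downstream."""
    return bool(record.get("id")) and "@" in record.get("email", "")

# Pure functions like these are unit-testable without a Spark cluster;
# in a notebook they would be registered as UDFs or re-expressed as
# DataFrame column operations.
rows = [
    {"id": 1, "email": "  Alice@Example.COM "},
    {"id": None, "email": "broken"},
]
cleaned = [{**r, "email": clean_email(r["email"])} for r in rows]
valid = [r for r in cleaned if is_valid_record(r)]
print(valid)  # → [{'id': 1, 'email': 'alice@example.com'}]
```

The design choice is simply dependency inversion: tests exercise the logic on plain dicts, and only an integration test needs a real (or local) SparkSession.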
… a highly scalable, reliable and efficient data system to support the fast-growing business. You will work with analytic tools, write excellent SQL scripts, optimize the performance of SQL queries, and partner with internal customers to answer key business questions. We look for candidates who …
- … warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Knowledge of AWS Infrastructure
- Knowledge of writing and optimizing SQL queries in …
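"Optimizing the performance of SQL queries", which this listing and the earlier Amazon one both ask for, usually starts with reading the engine's execution plan before and after a change. A minimal sketch using sqlite3's `EXPLAIN QUERY PLAN` (warehouse engines such as Redshift and Spark SQL expose the same idea via `EXPLAIN`); the table, column and index names are invented for the illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")

# Plan before any index: the engine must scan the whole table.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

# Add an index on the filter column, then re-check the plan.
cur.execute("CREATE INDEX idx_events_user ON events(user_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

# The last column of each plan row is the human-readable detail:
# a table scan before, an index search after.
print(plan_before[0][-1])
print(plan_after[0][-1])
```

The same workflow scales up: run `EXPLAIN`, find the scan or shuffle that dominates, add an index (or, on Spark, a partition/cluster key), and confirm the plan changed.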