Swindon, Wiltshire, South West England, United Kingdom
SoftServe
domains. We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: Experienced with Python/PySpark; Proficient working with Databricks Lakehouse architecture and principles; Having 2+ years of designing data …
Bolton, Greater Manchester, North West England, United Kingdom
SoftServe
domains. We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: Experienced with Python/PySpark; Proficient working with Databricks Lakehouse architecture and principles; Having 2+ years of designing data …
Portsmouth, Hampshire, South East England, United Kingdom
SoftServe
domains. We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: Experienced with Python/PySpark; Proficient working with Databricks Lakehouse architecture and principles; Having 2+ years of designing data …
Oxford District, South East England, United Kingdom
SoftServe
domains. We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: Experienced with Python/PySpark; Proficient working with Databricks Lakehouse architecture and principles; Having 2+ years of designing data …
Kingston upon Hull, East Yorkshire, Yorkshire and the Humber, United Kingdom
SoftServe
domains. We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: Experienced with Python/PySpark; Proficient working with Databricks Lakehouse architecture and principles; Having 2+ years of designing data …
Crawley, West Sussex, South East England, United Kingdom
SoftServe
domains. We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: Experienced with Python/PySpark; Proficient working with Databricks Lakehouse architecture and principles; Having 2+ years of designing data …
Newcastle-upon-Tyne, Tyne and Wear, North East England, United Kingdom
SoftServe
domains. We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: Experienced with Python/PySpark; Proficient working with Databricks Lakehouse architecture and principles; Having 2+ years of designing data …
Redshift, Kafka, etc.). Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues. Experience with Databricks, Snowflake, and Iceberg is required. Preferred qualifications, capabilities, and skills: Understanding of application and data design disciplines with an emphasis on real-time processing and delivery, e.g. …
risk metric reports using SQL and data visualization tools like Tableau; Web development skills for risk management UI applications; Experience with databases such as Snowflake, Sybase IQ, and distributed systems like HDFS; Ability to interact with business users to resolve issues; Experience designing and supporting batch processes with scheduling infrastructure …
in managing, querying, and transforming data using SQL, Python, or other relevant languages. Experience working with databases, including data modeling and query optimization (e.g., Snowflake, PostgreSQL). A strong understanding of CI/CD pipelines, TDD/BDD, and the DevOps culture. Excellent problem-solving and debugging skills, particularly in …
relevant experience. Proficiency in dbt required. 4+ years of SQL experience. Solid understanding of ETL/ELT methodologies. Significant experience with cloud data warehouses (Snowflake, Redshift, BigQuery). Experience with Python and shell scripting is a plus. Experience with Agile development methods. Understanding of Tableau. Experience with AWS cloud technologies …
Utilize web development technologies to facilitate application development for the front-end UI used for risk management actions. Develop software for calculations using databases like Snowflake, Sybase IQ, and distributed HDFS systems. Interact with business users to resolve issues with applications. Design and support batch processes using scheduling infrastructure for calculation …
Utilize web development technologies to facilitate application development for the front-end UI used for risk management actions. • Develop software for calculations using databases like Snowflake, Sybase IQ, and distributed HDFS systems. • Interact with business users to resolve issues with applications. • Design and support batch processes using scheduling infrastructure for calculation …
for front-end development, and writing complex SQL queries for data extraction, transformation, and reporting; Experience with data warehousing concepts and platforms, such as Snowflake and Amazon Redshift, and with databases such as Postgres, Solr, Accumulo, or Iceberg; Experience integrating structured and unstructured data from various sources such as APIs …
platforms are built with Clojure, employ a polylith architecture, are deployed using CI/CD, heavily exploit automation, and run on AWS, GCP, k8s, Snowflake, and more. We serve 9 petabytes and 77 billion objects annually, which amounts to 20 billion ad impressions across the globe. You'll play a …
ML engineering domain on technical approaches to balance delivering near-term commercial impact and building long-term foundations. Our Tech Stack: Cloud Data Warehouse - Snowflake; AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda; Data Governance & Quality - Collate & Monte Carlo; Infrastructure as Code - Terraform; Data Integration & Transformation - Python, DBT, Fivetran …
and cloud data warehouses, with a proven track record of designing and implementing complex, performant data models in enterprise-grade cloud environments (e.g., BigQuery, Snowflake, Redshift), consistently optimising for scale and cost-efficiency. Demonstrate mastery of SQL and dbt, exhibiting expertise in advanced SQL techniques and extensive experience in developing …
and Dimensional (Kimball) data modelling • Proficiency in SQL (T-SQL, PL/SQL, Databricks SQL) Desirable: • Databricks (or an alternative modern data platform such as Snowflake) • Experience working in a regulated environment and knowledge of the risk and compliance requirements associated with this. • Oracle Database • MongoDB • Cloud Data Technologies (mainly Azure …
Python; experience with Polaris is a helpful addition. Expertise with Core Java and Spring. Advanced SQL. Experience with cloud technologies is a plus (AWS, Snowflake, etc.). Familiarity with equities and equity derivatives within a real-time electronic trading environment is required. Strong communication skills; ability to liaise with investment professionals …
and mentoring engineering teams. Excellent communication skills, with the ability to bridge technical concepts and business objectives. Hands-on experience with cloud platforms (e.g., Snowflake, AWS) and modern data pipeline tools like Airflow and dbt.