… in value-based selling methodologies (e.g., MEDDICC, Force Management, Demo2Win). Proficient with BI platforms such as Tableau, Looker, or Power BI. Strong SQL and Python skills for data modelling, integration, and automation. Hands-on experience with APIs and tools like Postman. Familiarity with AI technologies including LLMs, AI agents, and …
Bring a broad IT skill set, including hands-on experience with Linux, AWS, Azure, Oracle 19 (admin), Tomcat, UNIX tools, Bash/sh, SQL, Python, Hive, Hadoop/HDFS, and Spark. Work within a modern cloud DevOps environment using Azure, Git, Airflow, Kubernetes, Helm, and Terraform. Demonstrate solid knowledge of …
London, South East England, United Kingdom (Hybrid / WFH Options)
Kantar Media
Hands-on experience with cloud-native data tech (preferably AWS: Redshift, Glue, S3, Lambda, IAM, Terraform, GitHub, CI/CD). Proficiency in SQL and Python for data processing and automation. Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt). Solid understanding of data governance …
… non-parametric (e.g. random forest, neural net) techniques as well as wider ML techniques like clustering/random forest (desirable). Tech Stack: SQL, Python, R, Tableau, AWS Athena + More! More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy, 2-for …
… datasets. Ability to apply machine learning techniques (e.g., scikit-learn, PyTorch, TensorFlow) and present complex data visually using Matplotlib, seaborn, or Streamlit. Fluent in Python, with experience working with data processing libraries such as Pandas. Strong SQL skills, a good understanding of Linux, parallel computing tools, and experience with Git …
… e.g., ETL, data models, queries) - Bachelor's degree in engineering, statistics, computer science, mathematics, or a related quantitative field - Experience with scripting languages (e.g., Python, Java, R) and big data technologies/languages (e.g., Spark, Hive, Hadoop, PyTorch, PySpark). PREFERRED QUALIFICATIONS - Master's degree or advanced technical degree - Knowledge of …
… fundamentals. You have experience working with distributed systems. You are proficient in one or more of the following programming languages: C#, Java, C, C++, Python, SQL, or Scala. You have some knowledge of software development code editors (Visual Studio, Visual Studio Code, Rider) and version control systems (Git, Perforce). …
… to a culture of results-driven collaboration, support and respect. What You'll Bring to the Role: Approx. 8 years' experience using SQL or Python for data analysis, with about 3 years' experience in P&C insurance. A degree at BSc or MSc level in a numerical field, preferably with …
… to analyze application and server logs and interpret errors. Excellent written and verbal communication skills and strong collaboration skills. Knowledge of SQL and experience with Python scripting. Familiarity with AWS services and Snowflake is a plus. Certification in Tableau Server Administration. A growth mindset with a willingness to adapt and continuously …
… FHIR, HL7, DICOM, SNOMED CT, ICD-10. * Cloud Platforms & Infrastructure: AWS, Azure, Google Cloud, NHS Cloud. * Business Intelligence & Data Tools: SQL, Power BI, Tableau, Python, R, Snowflake. * Healthcare Cloud: Data management and data integration, and knowledge of the Palantir Foundry toolset. * Data Management & Analytics: Experience handling large datasets and data collections …
… development efforts and ensuring cross-functional team alignment. Technical Skills: Strong technical background with hands-on experience in modern software development technologies (e.g., Java, Python, microservices, and cloud platforms like AWS and Azure). Expertise in software architecture, data architecture, and system design, with an ability to lead and influence technical …
The responsibilities and duties of the role will include, but are not limited to: Technical development and deployment: Develop analytical models and tools using Python and statistical or machine learning techniques to address business challenges. Collaborate with consultants to understand client needs and design appropriate data solutions. Create effective data …
… practices: code standards, code review, version control, CI/CD, testing, documentation, Agile, with the ability to mentor others in these practices. 10+ Advanced Python programming skills. PhD in a related field (computer science, math, machine learning). Preferred Qualifications: Peer-reviewed publications in major AI conferences. Experience with large language models …
… complex systems. Experience using MDAO frameworks such as OpenMDAO, ModelCenter, or modeFRONTIER. Proficiency in programming languages commonly used in trajectory optimization, such as MATLAB, Python, Julia, or C++. Experience with optimization libraries (e.g., GPOPS-ii, DIDO, Dymos, CasADi, or IPOPT) is preferred. Strong analytical, problem-solving, and decision-making skills. …
Experience and skills: Deep understanding of the Python language. Extensive demonstrable Python ecosystem experience. Expert in Python testing frameworks. Experience designing, building and consuming production-quality REST and Streaming APIs. Experience creating Python build and deployment pipelines from the ground up. Proven track record delivering large-scale, complex and robust … applications in Python. Strong written and verbal communication skills in English, and the desire to work together with stakeholders. Ability to work independently in a fast-paced environment. Software engineering background. Beneficial for you to have: Object Oriented language (Java, C#, C++ or similar). Experience building Cloud native applications and … Kubernetes, Helm and Docker. Familiarity with *NIX platforms and associated tooling. Knowledge of financial derivative products and experience/familiarity with risk management. Skillset: Python, DevOps, CI/CD, Azure, Kubernetes …
… for data flows. Architectural Competencies: Data Modelling: designing dimensional, relational, and … Document data lineage and recommend improvements for data ownership and stewardship. Qualifications: Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory …
… house applications. This is a brilliant opportunity in a high-performing and truly supportive environment. The following skills/experience are essential: Java or Python development skills; previously worked in Compliance or Regulatory Reporting; ETL tools; Agile Scrum. Data Warehousing experience is desirable. Cloud computing is desirable (Azure, AWS, etc. …
London, South East England, United Kingdom (Hybrid / WFH Options)
Hunter Bond
Data Quality and Cleansing Subject Matter Experts. Location: Zurich, on-site. Timeline: February-May 2025 (subject to potential extension). English. Proficiency in SQL and Python and experience with SQL-based data manipulation and analysis. Strong understanding of data quality principles and data cleansing techniques. Experience with data profiling and data …