Azure, GCP) to store and process data. Document workflows, pipelines, and transformation logic for transparency. Key Skills & Experience: Strong hands-on experience in Python (Pandas, NumPy, PySpark). Experience building ETL/ELT processes. Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (e.g., Snowflake, Databricks). Understanding …
point of contact to our client representing Grey Matters Defense Solutions with their software releases. About you: • 3+ years experience using Python, SciPy, NumPy, Pandas, Boto3 • 3+ years experience using AWS C2S Integration, S3, SQS, SNS, AWS CLI • 3+ years experience using Linux, Bash, RHEL Derivatives • Integration ICD Documents and …
Scrapy, BeautifulSoup, or Selenium). Strong knowledge of data cleaning, standardization, and normalization techniques. Experience with data analysis and modeling using libraries such as Pandas, NumPy, Scikit-learn, or TensorFlow. Familiarity with SQL and database management systems (e.g., PostgreSQL, MySQL). Experience with cloud platforms (e.g., AWS, Azure, GCP) and …
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
Spectrum IT Recruitment
simply, and work well in cross-functional teams. Tech You'll Work With: ML & Data Science: Python (primary language); TensorFlow, PyTorch, or Keras; NumPy, pandas; Data pipelines (Azure Data Factory, Airflow, etc.); Applied ML: NLP, CV, transformers, GANs, time series, etc. Engineering & Cloud: Azure (or similar cloud platforms like AWS …
innovation and resource efficiency, contributing to the continuous development of their consulting offering and analytical capabilities as the company grows. Skills & Experience: Python (incl. pandas, numpy, fastapi, dash/plotly); Database development: e.g. SQL, PostgreSQL, SQLAlchemy, data warehousing, ETL pipelines; Cloud computing & DevOps: e.g. AWS (EC2, Lambda, S3), Docker, CI …
Requirements (Minimum Qualifications): - 7+ years of professional experience in Python development and data processing. - Deep expertise with Python data processing libraries such as PySpark, Pandas, and NumPy. - Strong experience with API development using FastAPI or similar frameworks. - Proficiency in test-driven development using PyTest and mocking libraries. - Advanced understanding of …
a data product company. Experience building or maintaining third-party or in-house data quality and cataloguing solutions. Experience with documentation of system architecture. Pandas, Jupyter, Plotly, DBT, Kafka. BI tools such as Tableau, Metabase and Superset. The current tech stack: Airflow, Clickhouse, DBT, Python, MongoDB, PostgreSQL, MariaDB, Kafka, K8s …
and the ability to apply this process when handling structured or unstructured data. Confident with using common data science tooling such as Jupyter notebooks, pandas, matplotlib, seaborn, numpy. API testing and security tools: Postman, Burp Suite, OWASP ZAP, etc. Strong knowledge of database management systems (DBMS) such as MySQL. Hands …
deliver impactful solutions. Ensure data quality, security, and governance. About You: Experience in analytics or model/data engineering. Advanced Python skills (Numpy/Pandas). Strong SQL and relational database design expertise. Excellent communication skills. Benefits: £6,000 per annum training & conference budget to help you up-skill and …
Solid understanding of machine learning concepts, algorithms, and libraries (e.g., scikit-learn, TensorFlow, PyTorch). Experience with data manipulation and analysis using tools like Pandas and NumPy. Familiarity with cloud computing platforms (e.g., AWS, Azure, GCP). Desired Qualifications: Master's degree. Demonstrated experience with the application of machine learning and …
FX markets - over the last 3-5 years - in an investment context. Fluent in Python 3.11+, the standard library, and external libraries like numpy, pandas, matplotlib, and scikit-learn. Confident in SQL Server or similar; experience with Azure and Docker is a positive. An understanding of machine learning processes and …
and machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL …
systems. Professional software development experience with a track record of delivering high-quality, production-grade code. Experience with scientific computing libraries such as NumPy, Pandas, or SciPy in production environments. Holistic software development mindset covering testing, documentation, security, and performance. Track record of mentoring other engineers and sharing knowledge across …
ideation to deployment. Be on-call for urgent AI model fixes or system failures. Qualifications: Proficiency in Python and related libraries (e.g., NumPy, SciPy, pandas) is required. Strong production experience with at least one framework: LangChain, AutoGen, or CrewAI. Deep understanding of agentic systems, autonomous workflows, and LLM-based automation.
data integration, data warehousing, and proficiency in programming languages, as well as expertise with ETL tools and data integration platforms. Programming & Scripting: Python (e.g., Pandas, Requests), PowerShell, Bash, or equivalent scripting languages. APIs & Integration: RESTful/SOAP APIs, OAuth, API authentication mechanisms. Databases: SQL (PostgreSQL, MySQL, MSSQL), NoSQL (MongoDB, DynamoDB …
Swindon, Wiltshire, United Kingdom Hybrid / WFH Options
RWE AG
/Financial Markets (e.g. trading products) and growth of technology. Advantageous, but not essential: Any experience with Python and its data processing libraries, e.g. Pandas, PySpark. An understanding of the basics of trade modelling and lifecycle. Basic knowledge of any energy trading regulations, i.e. EMIR/REMIT/Dodd …