… relevant insights. Advanced statistical and analytical techniques and concepts:
- Sampling methods
- Regression
- Properties of distributions
- Weighting sample-based data
- Proper usage of statistical tests
- Real-world applications
Python - NumPy, SciPy, Pandas, MLlib, scikit-learn, and other common data and machine learning related libraries. Working knowledge of SQL, data structures and databases (Snowflake desirable). This is a pragmatic and humble organisation who …
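To ground the statistical skills listed above, here is a minimal, illustrative sketch of weighting sample-based data and applying a statistical test with NumPy and SciPy. The data, group sizes, and effect sizes are entirely made up and are not taken from the advertised role.

```python
import numpy as np
from scipy import stats

# Hypothetical survey responses and post-stratification weights (made-up data).
rng = np.random.default_rng(0)
responses = rng.normal(loc=50, scale=10, size=200)
weights = rng.uniform(0.5, 2.0, size=200)

# Weighting sample-based data: weighted mean and weighted variance.
w_mean = np.average(responses, weights=weights)
w_var = np.average((responses - w_mean) ** 2, weights=weights)
print(f"weighted mean={w_mean:.2f}, weighted std={np.sqrt(w_var):.2f}")

# Proper usage of a statistical test: Welch's t-test (unequal variances)
# comparing two hypothetical groups.
group_a = rng.normal(50, 10, size=100)
group_b = rng.normal(52, 12, size=100)
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t={t_stat:.3f}, p={p_value:.3f}")
```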
Glasgow, Scotland, United Kingdom (Hybrid/Remote Options)
NLB Services
… months with possible extensions (No Sponsorship Available). Skills/Qualifications:
· 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
· 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
· 3+ years of proficiency in working with Snowflake or similar cloud-based …
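As a purely illustrative sketch of the pipeline work described in these requirements, the snippet below shows a small PySpark extract-transform-load step. The paths, table names, and column names are hypothetical, and on Databricks the `spark` session is normally provided for you.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example; on Databricks a SparkSession named `spark` already exists.
spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Extract: read raw order events (path is hypothetical).
orders = spark.read.parquet("/mnt/raw/orders")

# Transform: basic cleansing and a daily revenue aggregate.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("n_orders"))
)

# Load: write a curated table for downstream consumers.
# Delta format is assumed here (available on Databricks); elsewhere swap for parquet.
daily_revenue.write.format("delta").mode("overwrite").saveAsTable("curated.daily_revenue")
```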
… with a focus on application quality, performance, and security.
- Infrastructure-as-Code
- Security awareness (RBAC, networking, encryption at rest, etc.)
- Positive attitude towards LLM-assisted development
Preferred Toolset:
- Python (Pandas, Apache Airflow, PySpark, SQLAlchemy, Dask, Great Expectations, scikit-learn)
- Linux
- Containers (e.g. Docker)
- Databases (e.g. PostgreSQL)
- AWS (CDK/Python, Bedrock, SageMaker)
- Demonstrated interpersonal skills and ability to work closely …
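As an illustration of this toolset, here is a minimal Airflow 2.x DAG sketch that cleans a file with Pandas and applies a lightweight data check (standing in for a fuller Great Expectations suite). The DAG id, file paths, and column names are hypothetical.

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical paths; a real pipeline would use Airflow connections/variables.
RAW_PATH = "/data/raw/readings.csv"
CLEAN_PATH = "/data/clean/readings.parquet"

def extract_and_clean():
    df = pd.read_csv(RAW_PATH)
    df = df.dropna(subset=["sensor_id", "value"])
    # Lightweight data check in the spirit of Great Expectations: fail fast on bad data.
    assert (df["value"] >= 0).all(), "negative sensor values found"
    df.to_parquet(CLEAN_PATH, index=False)

with DAG(
    dag_id="sensor_readings_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule` assumes Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    clean = PythonOperator(task_id="extract_and_clean", python_callable=extract_and_clean)
```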
Aberdeen, Aberdeenshire, Scotland, United Kingdom (Hybrid/Remote Options)
Reed
… experience setting up CI/CD pipelines using tools like Azure DevOps, GitHub Actions, or GitLab CI, targeting Azure environments. Experience with popular Python libraries and frameworks such as Pandas, NumPy, and PySpark.
Benefits:
- Full-time, permanent contract.
- Competitive salary up to £60,000 per annum, depending on experience.
- Hybrid working model based in Aberdeen (2 days per week in …
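The CI/CD requirement is usually exercised through automated tests of the data code itself. Below is a hedged sketch of a pytest-style test for a small Pandas transformation, the kind of check an Azure DevOps or GitHub Actions workflow might run on every push; the function and column names are hypothetical.

```python
import pandas as pd
import pytest

# Hypothetical transformation under test: the kind of Pandas code a CI pipeline
# would exercise automatically on every commit.
def add_vat(df: pd.DataFrame, rate: float = 0.2) -> pd.DataFrame:
    out = df.copy()
    out["gross"] = out["net"] * (1 + rate)
    return out

def test_add_vat_adds_expected_column():
    df = pd.DataFrame({"net": [100.0, 50.0]})
    result = add_vat(df)
    assert list(result["gross"]) == pytest.approx([120.0, 60.0])
```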
… an umbrella company for the duration of the contract. You must have several years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will also have a number of years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is …
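To illustrate the ETL expertise mentioned here, a minimal Pandas/NumPy sketch follows, loading a daily aggregate into a warehouse staging table via SQLAlchemy. The connection string, file path, and table names are placeholders, not details from the role.

```python
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical warehouse connection and source export.
engine = create_engine("postgresql+psycopg2://user:pass@host:5432/warehouse")

# Extract: read a raw transactions export.
raw = pd.read_csv("/data/exports/transactions.csv", parse_dates=["created_at"])

# Transform: derive a GBP amount and aggregate to daily totals.
raw["amount_gbp"] = np.round(raw["amount"] * raw["fx_rate"], 2)
daily = (
    raw.assign(day=raw["created_at"].dt.date)
       .groupby("day", as_index=False)
       .agg(total_gbp=("amount_gbp", "sum"), n_txns=("amount_gbp", "size"))
)

# Load: replace the staging table used by downstream warehouse models.
daily.to_sql("stg_daily_transactions", engine, if_exists="replace", index=False)
```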