Leatherhead, Surrey, South East, United Kingdom Hybrid / WFH Options
RINA
experiencing a real breadth and variety of project work on some of the most technically advanced platforms in UK Defence, including unmanned air systems, Apache, Wildcat, Chinook and Typhoon, to name a few. The successful candidate will support and deliver Systems Engineering and Assurance projects for technically interlinked assets and …
Telford, Shropshire, West Midlands, United Kingdom
RECRUIT123 LIMITED
Caching (Redis) It's an advantage if you also have experience with: JavaScript frameworks such as Vue/React or similar; Linux server management; Apache or Nginx; DNS, SSL & domain management. What the role involves: Being part of a development team that is responsible for all aspects of ongoing …
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
data practices Possess strong knowledge of data tools, data management tools, and various data and information technologies, e.g. DAMA DMBOK, Microsoft SQL Server, Couchbase, Apache Druid, Spark, Kafka, Airflow, etc. In-depth understanding of modern data principles, methodologies, and tools Excellent communication and collaboration skills, with the ability to … native computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team …
of financial markets UI skills a plus (JavaScript/Angular/React etc.) Graph database experience Big data technology stack such as Spark, HDFS, Apache Flink etc. Data domain modelling Working knowledge of UNIX/Linux ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas …
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best practices. …
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best practices. …
Experience building and running monitoring infrastructure at a large scale, for example Elasticsearch clusters, Prometheus, Kibana, Grafana, etc. Web applications and HTTP servers: Java, Apache, Nginx. Load balancers: ELB, HAProxy, Nginx. Experience in running SQL/NoSQL data stores: RDS, DynamoDB, ElastiCache, Solr. Perks of Working at Viator Competitive …
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6 month initial contract with a trusted client of ours. CVs are being presented on Friday and …
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion At Databricks …
ML libraries (TensorFlow, PyTorch, scikit-learn, transformers, XGBoost, ResNet), geospatial libraries (shapely, geopandas, rasterio), CV libraries (scikit-image, OpenCV, YOLO, Detectron2). AWS, Postgres, Apache Airflow, Apache Kafka, Apache Spark. Mandatory requirements: You have at least 5 years of experience in a DS role, deploying models into production …
compliance with specifications Should understand the Banking domain Should have Core Banking knowledge Familiarity with databases, e.g. MySQL, MongoDB, web servers, e.g. Apache, and UI/UX design Excellent communication and teamwork skills Great attention to detail Organizational skills An analytical mind …
tasks, both for oneself and others, ensuring progress aligns with project goals. Nice to have: Previous experience in startup environments. Proficiency or experience with Apache Spark. Familiarity or background in working with Azure. Experience orchestrating workflows, particularly within distributed system environments. Knowledge of MLOps principles and practices, especially in …
SQL Data Warehouse, Azure Data Lake, Azure Databricks Azure Cosmos DB, Azure Data Factory, Azure Search, Azure Stream Analytics Delta Lake and Data Lakes Apache Spark Pools, SQL Pools (dpools and spools) Experience in Python, C# coding, Spark, PySpark, and Unix shell/Perl scripting. Experience in API data …
with Git for version control and project management, alongside some knowledge of Linux/Shell. Data platform familiarity - previous experience of working with both the Apache Spark and MapReduce data processing and analytics frameworks. Reporting expertise - experience with Tableau, Power BI and Excel, alongside notebooks for experiment documentation. What experience …
notebooks, Spark, and Docker Professional experience of designing, building and managing bespoke MLOps in cloud environments, using tools such as SageMaker Processing, SageMaker Pipelines, Apache Airflow, and MLflow. Strong, fundamental technical expertise in cloud-native technologies, such as serverless functions, API gateway, container registry, CloudFormation/CDK. Experience …
expertise in Windows server environments (inc. Group Policy, DNS, DHCP, File Services, IIS, etc.) Proven experience managing Enterprise Linux operating systems (inc. SSH, BIND, Apache, Nginx, MySQL, PostgreSQL, Bash, Python, Git, LDAP, NFS, Samba) RHEL/CentOS/Rocky Linux/Ubuntu Working with server hardware like Dell/…
operating and securing a DevOps and DevSecOps environment. PCs (Mac, Windows, Linux) Mobile devices (Android, iOS, Windows) Servers (Linux, Windows) Web servers (IIS, Apache, NGINX) Databases (SQL Server, Oracle, MongoDB, Postgres, Redis) Data warehouse (Redshift, Snowflake) Network devices (Firewalls, Proxy, NIPS, others) 8+ years of experience in critical …
as R, Python, Azure, Machine Learning (ML), and Databricks. Essential criteria and experience Proficiency in one or more analytical tools, e.g. R, Python, Tableau, Apache Spark, etc. Proficiency in Azure Machine Learning and Azure Databricks. Pro-activity and a self-starting attitude. Excellent analytical and problem-solving ability. Interest …
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
is not essential, but it would make things easier. The main technologies we use include: Elixir, Erlang, Python, Terraform, Ansible, Packer, EMR/Spark, Apache Spark, Apache Druid, BigQuery and Redis Familiarity with cloud technologies, ideally AWS and technologies such as EC2, ECS, EMR, AWS Lambda, DynamoDB, S3 …
machine learning or more general statistical analysis Strong software development skills with proficiency in Python or C++ Experience with analytics frameworks such as Pandas, Apache Spark, Dask, or Flink Experience with machine learning frameworks such as TensorFlow, JAX, PyTorch, Spark MLlib, Keras, or scikit-learn Experience in cloud-based … infrastructures such as AWS or GCP Exposure to orchestration platforms such as Apache Airflow or Kubeflow Proven attention to detail, critical thinking, and the ability to work independently within a cross-functional team What benefits do we offer? Condé Nast Learning Hub, where you'll find …
My client is a leading global technology consulting and digital solutions company that specializes in sectors such as banking, insurance, manufacturing, and healthcare. They leverage advanced technologies like cloud computing, AI, and data analytics to deliver scalable, cutting-edge solutions.