Fort Worth, Texas, United States Hybrid / WFH Options
Epsilon
skills, including the ability to summarize technically complex information for a non-technical audience; organizational, motivational, and interpersonal skills.
Additional, But Not Required Skills:
- Spark libraries
- Scala programming
- Python programming
- SQL queries
- Experience with distributed computing
- Experience with Hadoop and cloud databases
Additional Information About Epsilon: Epsilon is a more »
South East London, England, United Kingdom Hybrid / WFH Options
Revolution Technology
Sector, they are on the lookout for 2 AWS Data Engineers to come in on a contract basis.
Key Skills/Requirements:
- Must have Python & Spark experience
- Must have strong AWS experience
- Must have Terraform experience
- SQL & NoSQL experience
- Have built out Data Warehouses & built Data Pipelines
- Strong Databricks & Snowflake experience
- Docker, ECS, Kubernetes & Orchestration tools more »
a Data Scientist.
- DevOps experience in CI/CD.
- Experience using TensorFlow, Kubernetes, MLflow, Kafka, and Airflow.
- Experience using Python is a must (tools like AWS and Spark are beneficial).
- Excellent communication skills and team and colleague engagement.
- A keen interest in problem-solving and using scalable machine learning to solve the biggest more »
Deerfield, Illinois, United States Hybrid / WFH Options
WALGREENS
to perform analysis and interpret data
- Deep knowledge of SQL
- Deep knowledge of open-source data science and statistics packages such as Python, R, Spark, etc.
- Experience establishing and maintaining key relationships with internal (peers, business partners and leadership) and external (business community, clients and vendors) stakeholders within a matrix more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
we encourage you to apply.
Python App Developer Requirements:
- Proficiency with Python or PySpark.
- Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka.
- Experience with Big Data solutions or relational DBs.
- Demonstrated knowledge of software applications and technical processes within a cloud or microservice architecture.
- Hands more »
build their AI practice and a team around you.
Required Skills:
- Building cloud-native machine learning architecture with LlamaIndex, HuggingFace, SentenceTransformers, PyTorch, Python, Apache Spark.
- Experience with practical application of AI and scaling AI with these tools.
- Experience in Health Care is essential.
We would love to share more »
or Rust.
- Experience in building and enhancing compute, storage, and data platforms, with exposure to open-source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, Nate etc.
- Hands-on experience with IaC tools and automation, such as Terraform, Ansible, or Helm.
- Active engagement or contributions to the open-source more »
South East London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
data models, ETL processes, and data warehousing solutions.
Programming: Utilize Python, Java, Scala, or GoLang to build and optimize data pipelines.
Distributed Processing: Work with Hadoop, Spark, and other platforms for large-scale data processing.
Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark.
Database Management: Handle more »
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
ETL processes, and data warehousing solutions.
Programming: Utilize Python, Java, Scala, or GoLang to build and optimize data pipelines.
Distributed Processing: Work with Hadoop, Spark, and other platforms for large-scale data processing.
Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark.
Database more »
Luton, England, United Kingdom Hybrid / WFH Options
Ventula Consulting
science and analytics team in deploying pipelines. Coach and mentor the team to improve development standards.
Key requirements:
- Strong hands-on experience with Databricks, Spark, SQL or Scala.
- Proven experience designing and building data solutions on a cloud-based, big data distributed system (AWS/Azure etc.)
- Hands-on … models and following best practices.
- The ability to develop pipelines using SageMaker, MLflow or similar frameworks.
- Strong experience with data programming frameworks such as Apache Spark.
- Understanding of common Data Science and Machine Learning models, libraries and frameworks.
This role provides a competitive salary plus excellent benefits package. In more »
improvements
Key Skills:
- 3+ years of Python experience
- Highly statistical and analytical
- Exposure to Google Cloud Platform (BigQuery, GCS, Datalab, Dataproc, Cloud ML) (desirable)
- Spark & Hadoop experience
- Strong communication skills
- Good problem-solving skills
Qualifications:
- Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science … classification techniques, and algorithms
- Fluency in a programming language (Python, C, C++, Java, SQL)
- Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau)
This is a permanent position, and offers flexibility with Hybrid working, 2-3 days per week in the office, depending on workload more »
Agile framework
Building knowledge of all data resources within ND and prototyping new data sources internally and externally
What you’ll bring:
- Proficiency in technologies such as Spark (SQL and/or Scala), Kafka
- Experience in the Scala programming language
- Analytical and problem-solving skills, applied to data solutions
- Proficiency with traditional database SQL technologies
- Experience with … dynamic NewDay culture
We’re focused on what will drive impact in helping people move forward with credit. Our distinctive culture is geared to spark innovation and team working – with lots of open doors for development. Our customers can rely on us because we aim high, support each other more »
Greater London, England, United Kingdom Hybrid / WFH Options
Hunter Bond
My client is looking for a talented and motivated Big Data Architect (Azure, Databricks, Spark) to be based in their London office. You'll be responsible for providing technical leadership in architecting and designing end-to-end solutions for the organisation's data lake initiatives, as they provide increasing numbers … improvements in design, processes, and implementation to improve operational management, scalability, and extensibility.
The following skills/experience is essential:
- Strong implementation experience using Spark and Databricks
- Strong Cloud experience (ideally Azure)
- Previously heavily involved in an implementation programme
- Data Warehouse experience
- Strong stakeholder management experience
- Excellent IT background, ideally more »