in the Financial Industry and Capital Markets is a plus. Experience with Big Data technologies (e.g. NoSQL). Knowledge of BI tools like Power BI, MicroStrategy, etc. Exposure to Python and Scala. Exposure to the Salesforce ecosystem. About S&P Global Ratings: At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential …
About the Role: We are seeking a Principal Applied Machine Learning Engineer to be the foundational hire responsible for establishing Boost's machine learning capabilities. This is a high-impact, high-ownership role for someone who thrives in greenfield environments …
engineering specialists, contributing to the development and maintenance of advanced data pipelines and supporting various analytical initiatives. Responsibilities: • Assist in the development and maintenance of data pipelines using Spark, Scala, PySpark, and Python. • Support the deployment and management of AWS services including EC2, S3, and IAM. • Work with the team to implement and optimize big data processing frameworks such as … OR equivalent practical experience. • Basic knowledge of Spark and Hadoop distributed processing frameworks. • Familiarity with AWS services, particularly EC2, S3, and IAM. • Some experience with programming languages such as Scala, PySpark, Python, and SQL. • Understanding of data pipeline development and maintenance. • Strong problem-solving skills and the ability to work collaboratively in a team environment. • Eagerness to learn and grow …
ensure cost efficiency and high performance. Design and manage data lakes, data warehouses, and associated infrastructure to ensure data accuracy, integrity, and availability. Write high-quality, efficient code in Scala, Python, and Spark, following best practices for maintainability and scalability. Develop and implement robust monitoring, alerting, and error-handling mechanisms to ensure pipeline reliability. Partner with business stakeholders, product managers … Provide on-call support as part of a shared team rota to ensure platform availability. ABOUT YOU: Proven expertise working with Databricks, including Unity Catalog. Strong programming skills in Scala, Python, Spark & SQL/MySQL. Solid experience with version control systems, particularly Git. Strong background in designing and optimizing complex data pipelines and infrastructure. Experience leading and mentoring technical teams …
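To make the Scala/Spark expectation above concrete, here is a minimal, hypothetical sketch of the kind of batch pipeline code such a role involves; the bucket paths, column names, and job structure are illustrative assumptions, not details taken from this listing.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

import scala.util.{Failure, Success, Try}

object DailyOrdersPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-orders-pipeline")
      .getOrCreate()

    // Hypothetical input/output locations; in practice these would come from configuration.
    val result = Try {
      val orders = spark.read.parquet("s3://example-bucket/raw/orders/")
      orders
        .filter(col("status") === "COMPLETED")   // keep only completed orders
        .groupBy(col("order_date"))              // aggregate per day
        .count()
        .write
        .mode("overwrite")
        .parquet("s3://example-bucket/curated/orders_daily/")
    }

    result match {
      case Success(_) =>
        spark.stop()
      case Failure(ex) =>
        // This is where a monitoring/alerting hook would fire in a production pipeline.
        spark.stop()
        throw ex
    }
  }
}
```

In a real Databricks deployment the session, paths, and alerting mechanism would be supplied by the workspace and job configuration rather than hard-coded as above.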
sets through the use of scripts or algorithms. Required Skill Sets: • Experience programming or scripting and debugging in one or more languages such as: Python, JavaScript, R, SQL, Scala, etc. • Proficiency with data mining, mathematics, and statistical analysis demonstrated with hands-on academic and project experience conducting statistical analysis, testing, and modeling using regression analysis, linear regression, predictive modeling …
of experience in DevOps/Data Engineering. Extensive familiarity with AWS services, to include CloudFormation, EC2, S3, and RDS. CloudFormation, Ansible, Git, Jenkins, Bash. Programming Languages: Python, Java, or Scala. Processing Tools: Elasticsearch, Spark, NiFi, and/or Docker. Datastore Types: Graph, NoSQL, and/or Relational. US Citizenship and an active TS/SCI with Polygraph security clearance required …
see the direct impact of your work on revenue opportunities. What you offer: 5+ years of experience in a related role with hands-on coding in Python, C++, Java, Scala, or other major languages. Strong technical leadership skills, including a systematic mindset and a proven track record of designing elegant, scalable, and pragmatic solutions with immediate impact. Ability to work …
teams. A thorough understanding of Java and SQL and a solid grasp of best practices in software development. Experience using big data and related technologies (like Spark, Python, Scala, Kafka). Willingness to become an AWS or Confluent Certified Developer. Very good knowledge of Linux systems and shell scripting. A positive attitude and willingness to feed our family feel, share …
and maintenance, specifically VBA scripting. Proficiency in programming languages such as Python and SQL. Experience with VBA for MS Access automation is essential. Experience with other languages like Java, Scala, R, PL/SQL, C++, C#, and HTML is a plus. Strong understanding of database design principles, including data modeling, schema design, and database normalization. Experience with database management systems …
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
Azure DevOps). Excellent communication and stakeholder management skills. Bonus Points for: Previous experience in a consultancy environment. Hands-on coding experience in additional languages like Python, Ruby, Scala, PHP, or C++. Knowledge of performance testing tools like JMeter, Gatling, K6, or Neoload. What's in It for You? At Ten10, we believe in recognizing and rewarding great work.
Proficiency in performance testing tools like JMeter, Gatling, K6, Neoload, or Webload. Strong coding skills in at least one language such as Java, TypeScript/JavaScript, Python, C#, Scala, or PHP. Experience designing and building automation frameworks. Familiarity with Agile development environments (SCRUM, Kanban, TDD, BDD). Implementing pipelines using common tooling such as Jenkins, ADO, GitHub Actions …
Ability to articulate the challenges faced in large data transformation and propose solutions to deliver effective value across the business. Some hands-on coding experience with SQL, Python, or Scala, and data visualisation tools would be advantageous. Relevant experience in Data Platform Technologies would be beneficial. Previous consulting experience would be a plus, but so would the curiosity and ambition to …
Google would be beneficial). Knowledge of the common functions in a typical Data organisation. Demonstrable interest and awareness in emerging technologies. Hands-on coding experience with SQL, Python, or Scala. What we look for: We are looking for highly motivated individuals who are passionate about Data and Analytics and want to assist wealth and asset management clients to become truly …
understand and translate between languages and methodologies used both in research and engineering fields. Hands-on experience in other programming/scripting languages and development stacks (Java, Rust, Scala, TypeScript, etc.). What's in it For You? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
WorksHub
on AWS. Our services teams are advocates of functional programming, so you can expect to join a team that's applying principles from FP to build these services: using Scala, Cats, ZIO, http4s, FS2, and Cats Effect. THE TEAMS: Experimentation: Our high-scale in-house A/B testing platform. Deeply integrated into our platform to enable experimentation on every … both technical and soft skills. Comfort with ambiguity and leading conversations where discordant views are present. Experience or strong interest in functional programming and its real-world applications - particularly Scala stacks such as Scalaz, Cats Effect, and ZIO. Demonstrable experience working as part of a high-performing, collaborative, agile team to deliver significant features/requirements. Experience of designing, building …
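As a purely illustrative sketch of the Scala functional-programming stack this listing names (Cats Effect and http4s), here is a minimal HTTP service in that style. The experiment endpoint, the bucketing stub, and the assumed library versions (http4s ~0.23 with its Ember server on Cats Effect 3) are assumptions for illustration, not code from this team's platform.

```scala
import cats.effect.{IO, IOApp}
import com.comcast.ip4s._
import org.http4s.HttpRoutes
import org.http4s.dsl.io._
import org.http4s.ember.server.EmberServerBuilder
import org.http4s.implicits._

object ExperimentService extends IOApp.Simple {

  // Hypothetical endpoint: returns the variant assigned to a user for an experiment.
  val routes: HttpRoutes[IO] = HttpRoutes.of[IO] {
    case GET -> Root / "experiments" / experimentId / "users" / userId =>
      // Deterministic bucketing stub; a real platform would consult experiment configuration.
      val variant =
        if (math.abs((experimentId + userId).hashCode) % 2 == 0) "control" else "treatment"
      Ok(variant)
  }

  // Wire the routes into an Ember server as a managed Resource and run it forever.
  val run: IO[Unit] =
    EmberServerBuilder
      .default[IO]
      .withHost(ipv4"0.0.0.0")
      .withPort(port"8080")
      .withHttpApp(routes.orNotFound)
      .build
      .useForever
}
```

The appeal of this style, as the listing suggests, is that effects are described as values (IO) and resources such as the server are acquired and released safely, which keeps high-scale services composable and testable.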
goals. Collaborate with stakeholders to deliver high-impact, cost-effective data solutions. Requirements: Hands-on experience in data engineering, automation, and analytics. Proficient in 2+ languages (e.g. Python, Java, Scala, Spark). Strong AWS and SQL/database skills. Experience with geospatial data and FME. Solid track record in technical delivery and team leadership. AWS/FME certifications preferred. Bonus: Experience …
Top Secret clearance or above is required. Required Skills & Experience: 4+ years of professional software engineering experience. Active Top Secret clearance or above. Experience with JVM languages: Java (preferred), Scala, Groovy, and/or Python. Experience with JavaScript frameworks: React (preferred), Angular, and/or Vue. Bachelor's in computer science or related field required. Desired Skills & Experience: Master's …
Columbia, South Carolina, United States Hybrid / WFH Options
Systemtec Inc
SageMaker, Unified Studio, RStudio/Posit Workbench, R Shiny/Posit Connect, Posit Package Manager, AWS Data Firehose, Kafka, Hive, Hue, Oozie, Sqoop, Git/Git Actions, IntelliJ, Scala. Responsibilities of the Data Engineer (AWS): Act as an internal consultant, advocate, mentor, and change agent providing expertise and technical guidance on complex projects. Work closely with customers, business analysts …
don't do maintenance mode. We're in it for the cool stuff - greenfield, cutting-edge, no legacy handcuffs. Our tech stack? Alive and kicking: Node.js, C++, Java, Scala, Python, SQL, TypeScript, JavaScript, React, Docker, Kubernetes, AWS, Google Cloud. Check our GitHub: smartclip - we don't just ship code, we share it. Tools & Loadout: Your gear should never bottleneck …
and services. Bring in your knowledge and experience to make an impact on our products and the technical architecture. Learn and work with big data technologies such as Node.js, Scala, Kafka, Kubernetes, and Snowflake. Live the complete development cycle, including task breakdown, coding, writing unit tests, participating in code reviews and performing deployments. Your skills: A Bachelor's or higher …