tools listed below: Platform design and engineering; (Meta)Data Management and Data Governance; Building ELT/ETL pipelines; Cloud environments: AWS, GCP, and Azure; Programming languages such as Python, Scala, Kotlin, and SQL; DevOps tools and skills: Building CI/CD pipelines, Kubernetes, Helm, Docker/Podman, Terraform; Data Warehousing and Analysis: BigQuery, Snowflake, Redshift, Databricks, DBT; Real-time Data Processing More ❯
IIBA (International Institute of Business Analysis)
and Capital Markets is a plus. Experience with Big Data technologies (e.g. Kafka, Apache Spark, NoSQL). Knowledge of BI tools like Power BI, MicroStrategy, etc. Exposure to Python and Scala. Exposure to the Salesforce ecosystem. About S&P Global Ratings At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential More ❯
these requirements. In order to secure one of these Senior Data Engineer roles you must be able to demonstrate the following experience: Experience in prominent languages such as Python, Scala, Spark, SQL. Good experience in using Databricks. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Experience with the design, build and maintenance More ❯
5+ years of professional software development experience in designing and building enterprise-level applications. Extensive professional knowledge, experience, and understanding of at least one modern programming language, such as Java, Scala, or Go. Extensive experience with software engineering principles, including data structures, algorithms, and design patterns. Hands-on experience with cloud technologies (AWS, Azure, GCP) and containerization (Docker, Kubernetes). Excellent More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Capgemini
Professional (or Associate with relevant experience) AWS Certified Data Analytics - Specialty (preferred) Experience with multi-cloud environments (Azure, GCP) is a plus Strong programming skills in Python, SQL, or Scala for data engineering use cases Good to have: Currently working in a major Consulting firm, and/or in industry, but having a Consulting background with proven ability to be More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a strong willingness to learn Experience using version control tools like Git and knowledge of CI/CD pipelines Familiarity with software testing methodologies and More ❯
focus on cloud-based data pipelines and architectures Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management Proficiency in Python, SQL, Scala, or Java Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory Strong understanding of data architecture principles, data modelling, and data governance Experience with More ❯
collaborating with stakeholders to define optimal solutions - Strong SQL skills with experience across relational and non-relational databases - Proficiency in a modern programming language such as Python, Java or Scala - Hands-on experience using DBT for pipeline development and transformation - Familiarity with cloud platforms such as AWS, Azure or GCP - Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory More ❯
3+ years of experience in Framework development and building integration Layers to solve complex business use cases. Technical Skills Strong coding skills in one or more programming languages - Python, Scala, Spark or Java Experience in working with petabyte scale data sets and developing integration layer solutions in Databricks, Snowflake or similar large platforms. Experience with cloud-based data warehousing, transformation More ❯
Proven track record in Data Engineering and supporting the business to gain true insight from data. Experience in data integration and modelling including ELT pipelines. Strong SQL, Python/Scala/R; experience with ELT pipelines and Power BI Solid knowledge of data Lakehouse architecture and Azure services Insurance or MGA data experience preferred Strong communication, stakeholder engagement, and problem More ❯
control, task tracking). ·Demonstrable experience writing ETL scripts and code to make sure the ETL processes perform optimally. ·Experience in other programming languages for data manipulation (e.g., Python, Scala). ·Extensive experience of data engineering and the development of data ingest and transformation routines and services using modern, cloud-based approaches and technologies. ·Understanding of the principles of data More ❯
incl. deep learning, GenAI, LLM, etc. as well as hands-on experience with AWS services like SageMaker and Bedrock, and programming skills such as Python, R, SQL, Java, Julia, Scala, Spark/NumPy/Pandas/scikit, JavaScript Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting More ❯
platforms (e.g., Hadoop, Spark, Kafka) and multi-cloud environments (AWS, Azure, GCP) Experience with machine learning frameworks (e.g., TensorFlow, Scikit-learn, PyTorch) Strong programming skills in Python, Java, or Scala Familiarity with data pipeline development and real-time data processing Proven ability to design and implement data-intensive solutions at scale Experience supporting analytics for the intelligence or defense community More ❯
Burton-on-Trent, Staffordshire, England, United Kingdom Hybrid / WFH Options
Crimson
using Azure. Stay current with Azure technologies and identify areas for enhancement. Databricks (Unity Catalog, DLT), Data Factory, Synapse, Data Lake, Stream Analytics, Event Hubs. Strong knowledge of Python, Scala, C#, .NET. Experience with advanced SQL, T-SQL, relational databases. Azure DevOps, Terraform, Bicep, ARM templates. Distributed computing, cloud-native design patterns. Data modelling, metadata management, data quality, data as More ❯
working knowledge of SQL, SQL Server, and Oracle databases Experience in building data pipelines, using ETL tools, and data modelling Proficient in programming languages such as Python, Java, or Scala Excellent communication and stakeholder management skills - this role will be key to integrating with key stakeholders in the business - must be strong with business partnering and a confident communicator Strong More ❯
web/mobile applications or platforms with either Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. Exposure to polyglot programming languages like Scala, Python and Golang will be a plus Ability to read/write code and expertise with various design patterns Have used NoSQL databases such as MongoDB, Cassandra, etc. Responsibilities include More ❯
predictive analyses and machine learning models. Preferred Experience: Preferred: Experience with cloud computing/storage, data lakes/warehouses. Preferred: Experience with Spark and ML applications. Preferred: Experience with Scala, Java Preferred: Experience with Robotic Process Automation (RPA) (e.g., UiPath) Preferred: Experience with Natural Language Processing (NLP) Preferred: Experience in Databricks More ❯
incl. deep learning, GenAI, LLM, etc. as well as hands-on experience with AWS services like SageMaker and Bedrock, and programming skills such as Python, R, SQL, Java, Julia, Scala, Spark/NumPy/Pandas/scikit, JavaScript Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace More ❯
container orchestration (Kubernetes, OpenShift, Docker, Mesos, etc.) Experience with different distributed technologies (e.g. Spark, S3, Snowflake, DynamoDB, CockroachDB, HDFS, Hive, etc.) Experienced with Java/Go/Python/Scala/other languages Proficiency in English Team Data Reply, as part of the Reply Group, offers a wide range of services to help clients become data-driven. The team is More ❯
of experience in DevOps/Data Engineering. Extensive familiarity with AWS services to include CloudFormation, EC2, S3, and RDS. CloudFormation, Ansible, Git, Jenkins, Bash. Programming Languages: Python, Java or Scala. Processing Tools: Elasticsearch, Spark, NiFi, and/or Docker. Datastore Types: Graph, NoSQL, and/or Relational. US Citizenship and an active TS/SCI with Polygraph security clearance required More ❯
teams. A thorough understanding of Java and SQL and a solid grasp of best practices in software development. Experience using big data and related technologies (like Spark, Python, Scala, Kafka). Willingness to become AWS or Confluent Certified Developer. Very good knowledge of Linux systems and shell scripting. A positive attitude and willingness to feed our family feel, share More ❯
Top Secret clearance or above is required Required Skills & Experience 4+ years of professional software engineering experience Active Top Secret clearance or above Experience with JVM languages: Java (preferred), Scala, Groovy, and/or Python Experience with JavaScript frameworks: React (preferred), Angular, and/or Vue Bachelor's in computer science or related field required Desired Skills & Experience Master's More ❯
sets through the use of scripts or algorithms. Required Skill Sets: • Experience programming or scripting and debugging in one or more languages such as: Python, JavaScript, R, SQL, Scala, etc. • Proficiency with data mining, mathematics, and statistical analysis demonstrated with hands-on academic and project experience conducting statistical analysis, testing, and modeling using regression analysis, linear regression, predictive modeling More ❯