frameworks and practices. Understanding of machine learning workflows and how to support them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS: Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks …
services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java) Demonstrated experience owning complex technical systems end-to-end, from design through production Excellent communication skills with the ability to explain technical concepts to both technical and non-technical audiences …
business to deliver value-driven solutions What we're looking for: London/Lloyd's Market experience is essential Strong programming skills in Python and SQL; knowledge of Java or Scala is a plus Solid experience with relational databases and data modelling (Data Vault, Dimensional) Proficiency with ETL tools and cloud platforms (AWS, Azure or GCP) Experience working in Agile and …
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar Experience building, defining, and owning data models, data lakes, and data warehouses Programming proficiency in Python, PySpark, Scala or Java. Experience operating in a cloud-native environment (e.g. Fabric, AWS, GCP, or Azure). Excellent stakeholder management and communication skills. A strategic mindset, with a practical approach to …
Solr is a strong advantage. Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. Deep expertise in Databricks, including Spark (PySpark/Scala), Delta Lake, and orchestration within Databricks workflows. Strong understanding of cloud infrastructure and data services on at least one major cloud platform (Azure preferred, but AWS or GCP also accepted) …
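The Databricks requirement above (Spark, Delta Lake, orchestration of Databricks workflows) is the kind of skill a short job sketch can illustrate. The Scala snippet below is a minimal, hypothetical example rather than code from any listing: the table paths (/mnt/raw/events, /mnt/curated/events) and the event_id column are assumptions, and the delta format assumes the Delta Lake libraries that a Databricks cluster provides by default.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical Databricks-style job: read a raw Delta table, deduplicate,
// stamp each row with processing metadata, and write a curated Delta table.
// Paths and column names are illustrative assumptions only.
object DeltaCurationSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("delta-curation-sketch")
      .getOrCreate()

    val raw = spark.read.format("delta").load("/mnt/raw/events")

    val curated = raw
      .dropDuplicates("event_id")                      // assumed business key
      .withColumn("ingested_at", current_timestamp())  // processing metadata

    curated.write
      .format("delta")
      .mode("overwrite")
      .save("/mnt/curated/events")

    spark.stop()
  }
}
```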
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage, Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a strong willingness to learn Experience using version control tools like Git and knowledge of CI/CD pipelines Familiarity with software testing methodologies and …
of AWS services including S3, Redshift, EMR, Kinesis and RDS. - Experience with Open Source Data Technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) - Ability to write code in Python, Ruby, Scala or other platform-related big data technologies - Knowledge of professional software engineering practices & best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build …
to perform pair programming and peer code review with fellow teammates. The Perfect Match. Proficient Programmer: Expertise in any of the modern programming languages such as Java, Python, Node.js, Scala, etc. Data Engineering Skills: Experience with data engineering and analytics technologies, including Spark, Hadoop, Kafka, Cassandra, and similar platforms. Educational Background: University degree in Computer Science or equivalent field. Meet …
developing and maintaining components in AWS using the Spring Boot framework Experience with ELK stack, OpenSearch, SonarQube, Cypress, PowerShell, C#, and Databricks Experience with Docker, SQL, Angular, Spring Boot, NiFi, AWS, Python, Scala, shell scripting, and XML processing Experience in AWS solution architecture Experience as a scrum participant and software release processes Available to work after hours when mission requires Communicate work using …
5+ years of professional software development experience in designing and building enterprise-level applications Extensive professional knowledge, experience, and understanding of at least one modern programming language, such as Java, Scala, or Go. Extensive experience with software engineering principles, including data structures, algorithms, and design patterns. Hands-on experience with cloud technologies (AWS, Azure, GCP) and containerization (Docker, Kubernetes). Excellent …
Extensive experience working with AWS with a strong understanding of Redshift, EMR, Athena, Aurora, DynamoDB, Kinesis, Lambda, S3, EC2, etc. Experience with coding languages like Python/Java/Scala Experience in maintaining data warehouse systems and working on large-scale data transformation using EMR, Hadoop, Hive, or other Big Data technologies Experience mentoring and managing other Data Engineers, ensuring …
Bethesda, Maryland, United States Hybrid / WFH Options
Gridiron IT Solutions
Statistics, Mathematics, or related field 7+ years of experience in data science or related field, plus additional 3+ years' experience in a complementary function Strong programming skills in Python, Scala, and/or UNIX shell scripting Expertise in machine learning techniques and statistical analysis Proficiency in SQL and NoSQL databases Experience with big data platforms such as Hadoop, Spark, and …
years of experience in software engineering Expert level proficiency in Python Experience in LangGraph, LangChain, and LangSmith Proficiency in at least one other statically-typed language such as Java, Scala, Rust, or C++ Demonstrated, hands-on experience building applications with Large Language Models Other Qualifications: Experience implementing agentic design patterns where an LLM uses tools to interact with its environment …
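LangGraph, LangChain, and LangSmith are Python tooling, but the agentic design pattern this listing describes (an LLM choosing tools, observing their output, and iterating) is language-agnostic. The sketch below expresses that loop in Scala with stubbed interfaces; Llm, Tool, ToolCall, and FinalAnswer are hypothetical stand-ins, not any real library's API.

```scala
// Hypothetical agentic tool-use loop: the model either requests a tool call or
// returns a final answer; tool observations are appended to the conversation
// history and the loop continues. None of these types mirror a real library.
trait Tool {
  def name: String
  def run(input: String): String
}

sealed trait LlmStep
case class ToolCall(toolName: String, input: String) extends LlmStep
case class FinalAnswer(text: String) extends LlmStep

trait Llm {
  def step(history: List[String]): LlmStep
}

object AgentLoopSketch {
  def runAgent(llm: Llm, tools: Map[String, Tool], question: String): String = {
    @annotation.tailrec
    def loop(history: List[String]): String =
      llm.step(history) match {
        case FinalAnswer(text) => text
        case ToolCall(toolName, input) =>
          // Execute the requested tool and feed the observation back to the model.
          val observation = tools.get(toolName)
            .map(_.run(input))
            .getOrElse(s"unknown tool: $toolName")
          loop(history :+ s"observation: $observation")
      }
    loop(List(s"question: $question"))
  }
}
```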
record of building and deploying ML models in the cloud (e.g., Amazon SageMaker or similar). - 5+ years developing with SQL, Python, and at least one additional programming language (e.g., Java, Scala, JavaScript, TypeScript). Proficient with leading ML libraries and frameworks (e.g., TensorFlow, PyTorch) PREFERRED QUALIFICATIONS - AWS experience preferred, with proficiency in a range of AWS services (e.g., SageMaker, Bedrock, EC2, …)
Strong hands-on experience with SQL (complex joins, window functions, CTEs, performance tuning). Proven experience in ETL development using tools like Informatica, Talend, DataStage, or custom Python/Scala frameworks. Familiarity with or experience in using Rhine for metadata-driven pipeline orchestration. Working knowledge of data warehousing concepts and dimensional modeling. Exposure to cloud platforms (AWS, Azure, or GCP) …
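To make the SQL expectations above concrete (CTEs and window functions in particular), here is a small Spark SQL example driven from Scala, in keeping with the custom Scala frameworks the listing mentions; it would run as-is in a notebook or spark-shell session. It is only a sketch: the orders table and its customer_id, order_id, and order_ts columns are assumed to exist as a registered table or view.

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: "orders" is assumed to be a registered table or view with
// customer_id, order_id, and order_ts columns.
val spark = SparkSession.builder().appName("sql-sketch").getOrCreate()

// CTE ("ranked") plus a window function to keep each customer's latest order.
val latestOrderPerCustomer = spark.sql(
  """
    |WITH ranked AS (
    |  SELECT customer_id,
    |         order_id,
    |         order_ts,
    |         ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) AS rn
    |  FROM orders
    |)
    |SELECT customer_id, order_id, order_ts
    |FROM ranked
    |WHERE rn = 1
    |""".stripMargin)

latestOrderPerCustomer.show()
```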
web/mobile applications or platform with either Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. Exposure to polyglot programming languages like Scala, Python and Golang will be a plus. Ability to read/write code and expertise with various design patterns. Have used NoSQL databases such as MongoDB, Cassandra, etc. Work on …
platforms (e.g., Hadoop, Spark, Kafka) and multi-cloud environments (AWS, Azure, GCP) Experience with machine learning frameworks (e.g., TensorFlow, Scikit-learn, PyTorch) Strong programming skills in Python, Java, or Scala Familiarity with data pipeline development and real-time data processing Proven ability to design and implement data-intensive solutions at scale Experience supporting analytics for the intelligence or defense community …
on the Unclassified side, Internal FBI Systems, etc.) HTML, Vue framework, JavaScript, Java and .NET; Apache Solr; .NET, C#, JavaScript, Java, Perl, .NET, C#, Vue 3, TypeScript, Python, Jupyter Notebook, Scala, Databricks Experience in Elasticsearch, Vue.js, Java Spring Framework, and .NET Framework web services. Oracle, API architecture (SOAP/REST), Git, REST APIs using technologies such as ASP.NET Core, Flask, Django …
incl. deep learning, GenAI, LLM, etc. as well as hands-on experience with AWS services like SageMaker and Bedrock, and programming skills such as Python, R, SQL, Java, Julia, Scala, Spark/NumPy/pandas/scikit-learn, JavaScript Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace …
sets through the use of scripts or algorithms Required Skill Sets: • Experience programming or scripting and debugging in one or more languages such as: Python, JavaScript, R, SQL, Scala, etc. • Proficiency with data mining, mathematics, and statistical analysis demonstrated with hands-on academic and project experience conducting statistical analysis, testing, and modeling using regression analysis, linear regression, predictive modeling …
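As a concrete illustration of the regression skills listed above, the following is a small, self-contained Scala sketch of simple linear regression fitted by the closed-form ordinary least squares solution. The sample data points are invented for the example; real work would typically rely on a statistics or ML library rather than hand-rolled formulas.

```scala
// Simple linear regression (y = intercept + slope * x) via ordinary least squares.
// The sample points in main() are invented purely for the example.
object SimpleLinearRegressionSketch {
  def fit(xs: Seq[Double], ys: Seq[Double]): (Double, Double) = {
    require(xs.length == ys.length && xs.nonEmpty, "need equal-length, non-empty samples")
    val n = xs.length.toDouble
    val meanX = xs.sum / n
    val meanY = ys.sum / n
    // slope = covariance(x, y) / variance(x); intercept = meanY - slope * meanX
    val slope = xs.zip(ys).map { case (x, y) => (x - meanX) * (y - meanY) }.sum /
      xs.map(x => (x - meanX) * (x - meanX)).sum
    val intercept = meanY - slope * meanX
    (intercept, slope)
  }

  def main(args: Array[String]): Unit = {
    val xs = Seq(1.0, 2.0, 3.0, 4.0, 5.0)
    val ys = Seq(2.1, 4.0, 6.2, 8.1, 9.9)
    val (intercept, slope) = fit(xs, ys)
    println(f"fitted model: y = $intercept%.3f + $slope%.3f * x")
  }
}
```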
and cloud platforms like AWS or Azure Exposure to SQL/NoSQL databases Leadership or mentoring experience Self-starter, flexible, and eager to learn Bonus: Exposure to C# or Scala. Why Join? Be part of a highly collaborative, inclusive team culture Enjoy continuous learning - take up to 5 paid days per year for training, with access to Pluralsight, and support …
including JAXP parsing and XSLT/XPath transforms for indexing. Experience using a templating engine and dynamic language to decorate search results and extend Java API (for example, Groovy, Scala or Python). Monitoring experience with Prometheus and Grafana. DevOps experience with Git/GitLab, Jenkins, Maven, Ansible, and Nexus. Experience with TDD or BDD, test automation, and performance testing.
Experience in Cloud Data Pipelines: Building cloud data pipelines involves using Azure-native programming techniques such as PySpark or Scala and Databricks. These pipelines are essential for tasks like sourcing, enriching, and maintaining structured and unstructured data sets for analysis and reporting. They are also crucial for secondary tasks such as flow pipelines, streamlining AI model performance, and enhancing interaction …
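As a sketch of the sourcing-and-enriching work described above, the following Scala Spark job reads raw events from cloud storage, enriches them against a reference dataset, and writes a partitioned, analysis-ready table. The storage paths, the customer_id join key, and the event schema are assumptions made for illustration, not details from the listing; on Databricks this would typically run as a scheduled workflow task.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical enrichment pipeline: source raw JSON events, join them to a
// curated reference dataset, add processing metadata, and persist a partitioned,
// analysis-ready table. Paths, columns, and the storage account are assumptions.
object EnrichmentPipelineSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("enrichment-pipeline-sketch")
      .getOrCreate()

    // Source: raw semi-structured events landed by an upstream ingestion process.
    val events = spark.read.json("abfss://raw@account.dfs.core.windows.net/events/")

    // Reference data used to enrich each event.
    val customers = spark.read.parquet("abfss://curated@account.dfs.core.windows.net/customers/")

    // Enrich and maintain: join, stamp processing metadata, write partitioned output.
    events.join(customers, Seq("customer_id"), "left")
      .withColumn("processed_at", current_timestamp())
      .withColumn("event_date", to_date(col("processed_at")))
      .write
      .mode("append")
      .partitionBy("event_date")
      .parquet("abfss://curated@account.dfs.core.windows.net/enriched_events/")

    spark.stop()
  }
}
```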
and systems-of-systems. • Experience with the evaluation and integration of new technologies and the development and maintenance of computer software. • Experience with NoSQL/NewSQL data stores, the Scala programming language, the Rust programming language, MQTT, and Big Data processing technologies such as Cassandra, Kafka, Spark, and associated technologies. • Experience with visualization technologies and user-interface development, including HTML5 …
Top Secret clearance or above is required Required Skills & Experience 4+ years of professional software engineering experience Active Top Secret clearance or above Experience with JVM languages: Java (preferred), Scala, Groovy, and/or Python Experience with JavaScript frameworks: React (preferred), Angular, and/or Vue Bachelor's in computer science or related field required Desired Skills & Experience Master's …