… Services, Microsoft Azure, Databricks, Snowflake. Architectural and/or feature knowledge of one or more of the following programming languages/packages: Python, Java, Scala, Spark, SQL, NoSQL databases. Experience working within Agile delivery methodologies. Proven ability to succeed in a matrixed organisation and to enlist support and commitment …
… best practices for building efficient and scalable data pipelines. Requirements: • Minimum 4 years of experience as a Data Engineer • Proven experience with Python or Scala • Knowledge of Data Warehousing, Data Lake, and Lakehouse paradigms • Experience with orchestration tools such as Airflow or Oozie • Ability to design and implement DevOps strategies for …
… platforms, and a very smart team. You might be a good fit if you: have 1–3 years of experience writing software (Python, Java, Scala, whatever works); know your way around SQL and databases; have touched cloud services (AWS, GCP, etc.); enjoy solving problems, learning fast, and working with good …
… cloud and hybrid environments. Responsibilities: Data Architecture: Design and implement scalable, reliable data architectures (AWS, Azure, Google Cloud) with modern tech stacks (Spark, Kafka, Scala). Focus on database designs, ETL processes, and data models for growth and efficiency. Data Quality & Security: Ensure high data quality, compliance, and security by …
… IMINT, or GEOINT; Experience with Infrastructure-as-Code tools (Ansible, Terraform); Experience with Big Data technologies (Spark, Cassandra, Hadoop); Experience with Java, Python, or Scala; Experience with Docker, Kubernetes, Bitbucket, Git …
… security, and compliance (GDPR, HIPAA, SOC 2). Expertise in AWS, Azure, GCP, Snowflake, Databricks, and big data processing frameworks. Proficiency in SQL, Python, Scala, Java, Spark, and data modelling. Client Engagement: Experience in agile project management, stakeholder engagement, and commercial negotiations. Leadership & Collaboration: Ability to scale consulting teams and …
… and multi-cloud environments (AWS, Azure, GCP); Experience with machine learning frameworks (e.g., TensorFlow, Scikit-learn, PyTorch); Strong programming skills in Python, Java, or Scala; Familiarity with data pipeline development and real-time data processing; Proven ability to design and implement data-intensive solutions at scale; Experience supporting analytics for …
… Love to get your hands dirty and solve challenging technical issues? Key job responsibilities: Experience in functional/programming languages such as Python, Spark, and Scala, and in analytics. Knowledge of distributed systems related to data storage and computing. Deep knowledge of various AWS big data technologies. Experience mentoring junior data and …
… data modeling, warehousing, and building ETL pipelines; Experience with SQL; Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or Node.js; Experience mentoring team members on best practices; Experience with big data technologies such as Hadoop, Hive, Spark, EMR; Experience operating large data warehouses …
… EMR, RDS, Redshift. Experience with stream-processing systems: Storm, Spark Streaming, etc. Experience with object-oriented/object-functional scripting languages: Python, Java, C++, Scala, etc. Salary: 30,000 per annum + benefits. …
… your work on revenue opportunities. What you offer: 5+ years of experience in a related role with hands-on coding in Python, C++, Java, Scala, or other major languages; Strong technical leadership skills, including a systematic mindset and a proven track record of designing elegant, scalable, and pragmatic solutions with …
… data clearly and concisely. PREFERRED QUALIFICATIONS: Experience in data warehousing, BI tools, techniques, and technology; Data engineering skills and programming knowledge (Java/Python/Scala); Knowledge of Star Schemas, Dimensional Models, Data Marts, Big Data, and Advanced Analytics; Strong analytical skills with attention to detail and big-picture understanding; Excellent …
… Master's degree in Computer Science, Engineering, Information Systems, or a related field; 6+ years of experience as a full-stack developer; Proven experience with both Scala and Angular; Solid understanding of functional programming and RESTful web services; Strong knowledge of HTML5, CSS3, JavaScript, and TypeScript; Familiarity with databases (SQL and …
… or Lakehouses, e.g. Databricks SQL, Snowflake, Synapse Analytics, Microsoft Fabric, etc. Comfortable understanding and writing complex SQL in Data Analytics projects. Excellent Python/Scala and Spark programming skills. Solid understanding of delivery methodology and process. What you'll need to do next: If you have a proven track record …
… computer science, engineering). Excel at problem-solving and continuous learning. Have experience in building complex systems. Be proficient in programming languages such as Scala, Python, Java, or C#. Have knowledge of data management platforms (SQL, NoSQL, Spark, etc.). Be familiar with modern software tools (Git, CI/CD …
… NoSQL/Big Data databases; Experience with finance/derivative products; Experience with Java 17+ and/or additional JVM languages like Kotlin/Scala. About the Team: J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent corporations, governments …
… Basic experience with test automation frameworks (e.g., PyTest, Great Expectations, dbt tests, or similar). Proficiency in SQL and scripting languages (e.g., Python, Scala) for test automation. Understanding of CI/CD and DevOps practices, with exposure to Azure DevOps, GitHub Actions, or Jenkins. Knowledge of data governance …
… objectives and compliance standards; Support test and deployment of new products and features; Participate in code reviews. 🌱 About You: Expert knowledge of Java/Scala/Kotlin. Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop …