reviews and documentation of testing. Implementation and management of scalable big data infrastructure using IaC tools such as Terraform. AWS expertise in services such as EMR, EC2, Lambda, Step Functions, and Glue. Experience with big data technologies: well-rounded experience with Hadoop, Spark/Scala, and NoSQL databases in a regulatory …
Preferred Qualifications: Domain knowledge of the financial industry and capital markets is a plus. Working knowledge of AWS cloud services (EC2, ECS, Load Balancer, Security Group, EMR, Lambda, S3, Glue, etc.). Experience in DevOps development and deployment using Docker and containers. Python experience is a plus. About S&P Global Ratings …
cloud technologies. Proficiency in Python, PySpark, SQL, and AWS Glue for ETL development. Hands-on experience with AWS data services, including Redshift, Athena, Glue, EMR, and Kinesis. Strong knowledge of data modeling, warehousing, and schema design. Experience with event-driven architectures, streaming data, and real-time processing using Kafka or …
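For context on the kind of real-time processing the listing above describes, here is a minimal sketch of consuming Kafka events with PySpark Structured Streaming and landing them as Parquet. The broker address, topic name, schema, and S3 paths are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: real-time processing of Kafka events with PySpark Structured Streaming.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-etl").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
    .option("subscribe", "trades")                      # assumed topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/streams/trades/")      # assumed output path
    .option("checkpointLocation", "s3://example-bucket/chk/")   # assumed checkpoint path
    .outputMode("append")
    .start()
)
```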
frameworks such as PyTorch, TensorFlow, and practical experience in solving complex problems in an applied environment - Experience with AWS services such as SageMaker, EMR, S3, DynamoDB, and EC2, as well as experience with machine learning, deep learning, NLP, generative AI, distributed training, and model hosting. Amazon is an equal …
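As a generic illustration of the PyTorch experience the listing above asks for, a minimal supervised training loop might look like the sketch below. The model, toy data, and hyperparameters are all assumptions for illustration only.

```python
# Minimal sketch of a supervised training loop in PyTorch; model, data,
# and hyperparameters are illustrative assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# toy dataset: 1,000 examples with 20 features, binary labels
X = torch.randn(1000, 20)
y = torch.randint(0, 2, (1000,))
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for features, labels in loader:
        optimizer.zero_grad()
        logits = model(features)
        loss = loss_fn(logits, labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```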
Job ID: Amazon Development Center U.S., Inc. The analytics team is looking for an experienced engineer to join the core engines team. Athena and EMR are services that our customers use to run large-scale analytics, leveraging open-source engines like Trino and Spark. The analytics engine organization makes significant …
Nice If You Have: Experience with deploying analytics workloads on platform as a service (PaaS) and software as a service (SaaS) offerings, including AWS EMR, Redshift, or SageMaker, or Azure Databricks, SQL Data Warehouse, or Machine Learning service. Experience with distributed or parallel programming frameworks, including Apache Spark or NVIDIA …
Job summary - The Programme: EPUT and MSEFT are embarking on the implementation of a UK first-of-type Unified Electronic Patient Record (UEPR) system that spans Acute, Mental Health and Community services. The new system will be delivered through the Nova Programme, alongside … Foundation Trust (NEP) and South Essex Partnership University NHS Foundation Trust (SEPT). EPUT and MSEFT are embarking on a UK first-of-type EPR for a single system that spans Acute, Mental Health and Community health services. We have signed a contract with Oracle Health to … advice for medical, nursing and junior pharmacy staff to ensure maximum efficacy, safety and economic drug use. Demonstrate professional accountability to enhance patient care and ensure all clinical pharmacy staff adhere to the same principles. Lead and develop the clinical pharmacy services (across the inpatient and …
About Us: At Ascom, we are revolutionizing healthcare with cutting-edge technology that makes the invisible patient visible. Our innovative solutions empower healthcare professionals with real-time, reliable data to enhance patient care. If you're passionate about technology and making a tangible impact … solutions, mastering their implementation and integration. This role offers a unique opportunity to work with state-of-the-art technology that directly improves patient outcomes and transforms healthcare systems. What You'll Do: Master Ascom Solutions: Quickly become an expert in our solutions, ensuring seamless integration with … and remote. Experience and knowledge of integration methods in both medical device and non-medical device settings. Familiarity with healthcare systems (EMR, EHR, PACS) and medical data exchange standards (HL7, DICOM, IHE, etc.). Eagerness and willingness to learn new technologies. Strong understanding and proven experience …
in implementing cloud-based data solutions using AWS services such as EC2, S3, EKS, Lambda, API Gateway, and Glue, and big data tools like Spark, EMR, and Hadoop. Hands-on experience with data profiling, data modeling, and data engineering using relational databases like Snowflake, Oracle, and SQL Server; ETL tools like Informatica …
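To make the data-profiling requirement above concrete, here is a minimal PySpark sketch that computes row counts, per-column null counts, and distinct counts. The input path and dataset are assumptions for illustration.

```python
# Minimal sketch of basic data profiling with PySpark: row count, null counts,
# and distinct counts per column. The input path is an illustrative assumption.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-profiling").getOrCreate()
df = spark.read.parquet("s3://example-bucket/staging/customers/")  # assumed path

print(f"rows: {df.count()}")

profile = df.select(
    *[F.count(F.when(F.col(c).isNull(), c)).alias(f"{c}_nulls") for c in df.columns],
    *[F.countDistinct(F.col(c)).alias(f"{c}_distinct") for c in df.columns],
)
profile.show(truncate=False)
```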
Milton Keynes, Buckinghamshire, United Kingdom Hybrid / WFH Options
Banco Santander SA
Previous experience developing, testing, and deploying data pipelines, data lakes, data warehouses, and data marts, ideally using AWS services such as S3, Glue, Athena, EMR, Kinesis, and Lambda. Experience with major operating platforms and their linkage, connectivity functions, and issues. Cloud security - experience/skills in addressing concept issues …
2. (Mandatory) Demonstrated experience with AWS cloud services, including long-term storage options, and cloud-based database services such as Databricks or Elastic MapReduce (EMR). 3. (Mandatory) Demonstrated experience with SQL database structures and mapping between SQL databases. 4. (Mandatory) Demonstrated experience in large-scale data migration efforts. 5. …
medical technologies (AI, Blockchain, MIoT, RPA, etc.) and their security threats/risks. Knowledge of medical integration standards, including HL7, DICOM, EHR, and FHIR. …
Experience with automation tools (Ansible, Chef, Puppet) and scripting languages (Python, Bash, PowerShell). Experience with cloud-native big data tools such as AWS EMR, Azure Synapse, Google BigQuery, or OCI Big Data Service. Expertise in data pipeline orchestration tools such as Apache Airflow, AWS Glue, or Azure Data Factory. …
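For readers unfamiliar with the orchestration tools named above, a minimal Apache Airflow DAG is sketched below. The DAG id, schedule, and the extract/load callables are illustrative assumptions; nothing here comes from the posting.

```python
# Minimal sketch of a daily ETL DAG in Apache Airflow 2.4+ (older versions
# use schedule_interval instead of schedule). Task bodies are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # placeholder: pull raw records from a source system
    print("extracting")

def load():
    # placeholder: write transformed records to the warehouse
    print("loading")

with DAG(
    dag_id="example_daily_etl",   # assumed DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```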
understanding of Agile principles and development tools, including JIRA and Confluence. Authorization to work in the United States without sponsorship. Preferred Qualifications: Familiarity with EHR systems in a healthcare environment. Some knowledge of healthcare interoperability standards such as HL7 and FHIR. Basic understanding of DevOps practices, including building and maintaining …
JDK 1.8) and preferably some experience with Spring Boot. Good to have experience using AWS cloud services (e.g., EC2, S3, Lambda, MSK, ECS, EMR, RDS, Athena, etc.). Experience working with Maven, Jenkins, Git, etc. Understanding of database concepts and working knowledge of any of the major vendors (preferably Oracle) with …
years of experience in data engineering, database design, ETL processes, and data warehousing; 3 years of experience with AWS tools and technologies (S3, EMR, Glue, Athena, Redshift, RDS, Spectrum, and Airflow); 2 years of experience with CI/CD tools. Strong knowledge of data storage and processing technologies, including databases and data lakes …
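As an illustration of the Athena experience the listing above calls for, the sketch below runs a query from Python with boto3 and polls for completion. The region, database, table, and S3 output location are assumptions.

```python
# Minimal sketch of running an Athena query with boto3 and polling for completion.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")  # assumed region

response = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM analytics.page_views GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics"},  # assumed database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},  # assumed bucket
)
query_id = response["QueryExecutionId"]

# poll until the query reaches a terminal state
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:  # first row is the header
        print([field.get("VarCharValue") for field in row["Data"]])
```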
recruiting a Software Engineer focussing on Python for our Software Team. Our Tech Stack: Python, FastAPI, Redis, Postgres, React, Plotly, Docker, Athena SQL, Athena & EMR Spark, ECS, Temporal, AWS, Azure. What you can expect as a Software Engineer at Monolith AI: As a Senior/Lead Software Engineer at …
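For a feel of the FastAPI part of the stack listed above, here is a minimal service sketch. The endpoint paths, request model, and run command are illustrative assumptions, not details of Monolith AI's codebase.

```python
# Minimal sketch of a FastAPI service; endpoints and fields are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-service")

class Simulation(BaseModel):
    name: str
    iterations: int = 100

@app.post("/simulations")
def create_simulation(sim: Simulation) -> dict:
    # placeholder: enqueue the job (e.g. via a task queue) and return its id
    return {"id": 1, "name": sim.name, "iterations": sim.iterations}

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}

# run locally with: uvicorn app:app --reload
```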
healthcare-related metrics. Prepare and deliver monthly, quarterly, and annual financial reports to executive leadership and stakeholders. Extract and manipulate large datasets from ERP, EMR, and data warehouse systems to generate meaningful analysis. Identify and analyze trends, variances, and key performance indicators to support business planning and budgeting. Support the …
East London, London, United Kingdom Hybrid / WFH Options
Asset Resourcing
and database management systems (e.g., MySQL, PostgreSQL, MongoDB). Hands-on experience with AWS services for data processing, storage, and analysis (e.g., S3, Redshift, EMR, Glue). Familiarity with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data …
and data interoperability within the organization. The Integration Developer will be responsible for integration engines such as Mirth Connect, Rhapsody, Cloverleaf, and Ensemble; the MEDITECH EHR; protocols such as HL7, FHIR, and CDA; as well as SQL databases, RESTful APIs, SOAP, and middleware platforms. This position works closely with project managers …
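To illustrate the HL7 v2 segment/field structure behind the standards named above, here is a plain-Python parsing sketch. The sample ADT message is invented for illustration; real integrations would typically go through an engine such as Mirth Connect or a dedicated HL7 library rather than ad hoc string handling.

```python
# Minimal sketch: HL7 v2 messages are carriage-return-separated segments, with
# pipe-separated fields and caret-separated components. Sample message is assumed.
raw_message = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|202401011200||ADT^A01|MSG00001|P|2.5",
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F",
    "PV1|1|I|WARD1^101^A",
])

segments = {}
for segment in raw_message.split("\r"):
    fields = segment.split("|")
    segments.setdefault(fields[0], []).append(fields)

# PID-5 is the patient name (family^given)
pid = segments["PID"][0]
family_name, given_name = pid[5].split("^")[:2]
print(given_name, family_name)  # prints "JANE DOE"
```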
programming languages (e.g. Python, PySpark). 1+ years of experience with data orchestration/ETL tools (Airflow, NiFi). Experience with Snowflake, Databricks/EMR/Spark, and/or Airflow. Pay range and compensation package: the hiring range for this position in Santa Monica, CA is …
methodologies, and tools, including GitLab CI/CD and Jenkins. Experience with distributed data and computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka. Experience leading a team of AI and ML engineers, researchers, and data scientists to develop and deploy advanced AI and ML technologies in …
Proficiency in coding in one or more languages such as Java, Python, or PySpark. Experience in cloud implementation with AWS data services: Glue ETL or EMR, S3, Glue Catalog, Athena, Lambda, Step Functions, EventBridge, ECS, data de/serialization, Parquet and JSON formats, IAM services, encryption, KMS, and Secrets Manager. Practical …
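As a concrete example of the serialization work mentioned above, the sketch below reads JSON, normalises a couple of columns, and writes partitioned Parquet with PySpark. Paths and column names are assumptions; on AWS the same logic would typically run inside a Glue or EMR job.

```python
# Minimal sketch of a batch ETL step: JSON in, partitioned Parquet out.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("json-to-parquet").getOrCreate()

orders = spark.read.json("s3://example-bucket/raw/orders/")  # assumed input path

cleaned = (
    orders
    .withColumn("order_date", F.to_date("order_timestamp"))   # assumed column
    .withColumn("amount", F.col("amount").cast("double"))      # assumed column
    .dropDuplicates(["order_id"])                               # assumed key
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")  # assumed output path
)
```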
solutions that truly matter. Key Responsibilities: Design, develop, and optimize scalable data pipelines using technologies such as Apache Spark, Apache Iceberg, Trino, OpenSearch, AWS EMR, NiFi, and Kubernetes containers. Ingest and move structured and unstructured data using approved methods into enterprise or local storage systems. Create robust ETL processes and …
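For the Spark-plus-Iceberg combination named above, a minimal write path might look like the sketch below. The catalog name, warehouse path, table name, and input data are assumptions, and the iceberg-spark-runtime package must be on the Spark classpath.

```python
# Minimal sketch of writing a Spark DataFrame into an Apache Iceberg table
# through a Glue-backed catalog; all names and paths are illustrative assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-write")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse/")
    .getOrCreate()
)

events = spark.read.parquet("s3://example-bucket/staging/events/")  # assumed input

# create (or replace) the Iceberg table from the DataFrame
events.writeTo("lake.analytics.events").createOrReplace()

# later incremental runs can append instead:
# new_events.writeTo("lake.analytics.events").append()
```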