build and optimize critical data pipelines, transforming raw data into clean, reliable, and performant dimensional models for business intelligence. Modernize Core ETL Processes: Systematically refactor our existing Java & SQL (PostgreSQL)-based ETL system. You will identify and resolve core issues (e.g., data duplication, performance bottlenecks), strategically rewriting critical components in Python and migrating orchestration to Airflow. Implement Data Quality …
solutions that meet business objectives. 5 years of professional software engineering experience in Azure and Python. 3 years of experience with relational and non-relational databases such as SQL Server, PostgreSQL, Cosmos DB, MongoDB, etc. 2 years of demonstrable experience with Azure AI solution design and production delivery, highlighting proficiency in natural language processing (NLP), retrieval-augmented generation (RAG), and Azure AI …
applications as a full stack engineer. Expert knowledge of front-end development including JavaScript, React, Redux, and supporting technologies. Knowledge of backend development in Python with Flask, FastAPI, SQLAlchemy, Postgres, or another modern stack. Extensive understanding of software design patterns and principles. Experience with Agile/Scrum methodologies. Excellent communication and teamwork skills. The ability to work independently and take …
Ashburn, Virginia, United States (Hybrid/Remote Options)
Shuvel
optimizing data pipelines and architectures. Must have full life cycle experience in design, development, deployment, and monitoring. Experience with one or more relational database systems such as Oracle, MySQL, Postgres, SQL Server, with heavy emphasis on Oracle. Extensive experience with cloud platforms (e.g., AWS, Google Cloud) and cloud-based ETL/ELT tools. Experience with Amazon services such as …
Want to live in sunny St George and enjoy a healthy work-life balance; AWS certified and/or extensive experience with AWS services; Linux system administration; MySQL and PostgreSQL database administration; Webserver administration; Writing and managing AWS IAM policies; Setting up secure network access with AWS Security Groups; Configuring CI/CD workflows; Managing codebases with Git and GitHub …
cross-team projects without authority, and driving design and technology decisions. Technologies we use (nice to have experience): Monitoring and alerting: Datadog, Falcon LogScale (formerly Humio). Database management systems: PostgreSQL, ClickHouse. Deployment tools: Flux, Helm, Kustomize. Frontend frameworks: React, Angular. Infrastructure as code: Terraform, Terragrunt. Cloud provider: AWS. Event streaming platform: Kafka. Big data processing: Databricks. About Chainalysis: Blockchain technology …
Migration projects from no-code platforms to traditional development environments. Technical Skills: Modern web frameworks (React, Vue.js, Angular, or similar). Backend development (Node.js, Python, PHP, or similar). Database management (PostgreSQL, MySQL, MongoDB). API integration. Cloud services (AWS). Security best practices for financial applications. Preferred Experience: Fintech Expertise: Lending platforms, payment processing, or financial management systems. African Market Knowledge: Understanding of …
Java. Experience in Frontend Technologies (preferably TypeScript, React). Strong experience with data modeling, schema evolution, and ETL/ELT pipelines. Hands-on experience with cloud platforms (AWS, GCP), databases (Postgres, BigQuery), and streaming (Kafka, PubSub). Experience building customer-facing APIs and/or data products. A strong product mindset and excitement for collaboration with PMs and customers. Excellent communication skills …
the change that healthcare - one of the largest and most inefficient industries in the world - needs. We want you to join us. Our Tech Stack: Backend: Python (FastAPI, Flask), PostgreSQL, Redis, Celery. Infrastructure & DevOps: Docker, Kubernetes (EKS), AWS (SQS, SNS), GitLab CI/CD, ArgoCD, DataDog, OpenTelemetry, Pulumi. Responsibilities: Automate workflows and CI/CD pipelines to speed up delivery …
San Francisco, California, United States (Hybrid/Remote Options)
Mlabs
in software engineering, AI/ML/NLP engineering, and data analytics/engineering. Core Stack: Proficiency in Python and SQL, along with experience using core data tools like PostgreSQL, dbt, and BigQuery. ML/AI Tools: Hands-on experience with ML/AI technologies such as LLM prompt engineering and frameworks like PyTorch. Systems Engineering: Experience with cloud …
capacity. Understand and promote security best practices at all levels of the organization. Strong understanding of cloud service providers: AWS, Azure, etc. Securing Data Platforms such as Kafka, Oracle, PostgreSQL, etc. Multi-tenant platform or service delivery. Continuous Compliance and Auditing methodologies. Scripting automation or developing software: Python, PowerShell, NodeJS, etc. Implementing infrastructure-as-code concepts and technologies. Exposure to …
and on-the-job development will enable you to plug any knowledge gaps. Software development in web technologies and object-oriented programming. Database technologies such as Oracle SQL, Mongo, Postgres. Know your way around Linux and Windows command lines, e.g. Bash and PowerShell. Monitoring large systems using technologies such as Grafana, Prometheus, ELK, Splunk. Experience of working in Agile teams …
London, South East England, United Kingdom (Hybrid/Remote Options)
Wealth Dynamix
in the financial sector or similar data-sensitive environments. Familiarity with Microsoft Azure and related services (Azure Service Bus, Key Vault, Table Storage, Cognitive Services). Knowledge of SQL (PostgreSQL), including data modelling, performance optimization, and security. Experience with Kafka and event-driven architectures. Exposure to enterprise integration tools such as Informatica, BizTalk, MuleSoft, or TIBCO, with an understanding of …
IT field. 10-15 years of extensive software development experience. Strong proficiency in Java, including class creation, JVM internals, and garbage collection, with certifications preferred. Proficiency in SQL (Oracle, MS SQL, PostgreSQL), including stored procedures; certifications preferred. Expertise in Linux command-line scripting (bash) and system administration. Hands-on experience developing low-latency, high-throughput Java components. Practical experience building cash equity …
distributed system design. Proficiency in Linux administration, Docker, and network configuration. Familiarity with Python or scripting languages for automation and data handling. Experience with databases (e.g., TimescaleDB, InfluxDB, or PostgreSQL) and data orchestration tools. Understanding of industrial cybersecurity principles and safe plant integration. Desirable: Exposure to AWS/Azure IoT/Greengrass/Kubernetes or other relevant orchestration frameworks. Experience …
YAML, JSON. Comfortable with multiple varied data formats and the pros/cons of each (e.g., CSV, Feather, Parquet). Proficiency in multiple database structures (e.g., MS SQL, Postgres, Snowflake). Compensation & Benefits: The anticipated salary range for candidates is $105,400/year in our lowest geographic market up to $140,000/year in …
competing technical solutions through awareness of the constantly shifting collections, processing, storage and analytic capabilities and limitations. Required Skills/Experience: Experience with SQL databases and data models (e.g., PostgreSQL, Amazon Aurora). Experience with programming in R, Python, Java. Experience with NoSQL databases (e.g., MongoDB, Accumulo). Experience designing data models and approaches. Experience working in a cloud environment (e.g., AWS …
Systems, or a related field. 3-6 years of experience in data engineering, ETL development, or database architecture (experience in financial services or trading a plus). Strong SQL expertise (PostgreSQL, MySQL, or similar) and experience designing normalized and denormalized data models. Proficiency with Python and other scripting languages for building and maintaining ETL pipelines and data loaders. Experience with API …
systems. Experience with core AWS services (S3, RDS, EMR, EKS, OpenSearch, etc.). Experience with writing complex yet efficient SQL queries for data analysis purposes. Experience with common DBMS (PostgreSQL, MySQL, MongoDB, etc.). Experience with data warehousing and parquet data manipulation (e.g., Athena, Redshift, BigQuery). Experience with Spark, Beam, Kafka, Hadoop, or other data processing tools. Fluency in …