accountability, and communication skills. Bonus Points For: Experience leading projects in SCIF environments. Expertise in Cyber Analytics, PCAP, or network monitoring. Familiarity with Spark, Dask, Snowpark, Kafka, or task schedulers like Airflow and Celery.
leading project development efforts from a SCIF. Familiarity with cybersecurity analytics, including PCAP, CVEs, and network monitoring. Experience integrating with technologies such as Spark, Dask, Snowpark, or Kafka. Background in web application stacks (e.g., Flask, Django) or task schedulers (e.g., Airflow, Celery, Prefect). Compensation & Benefits: Competitive salary, equity, and
technical concepts to non-technical stakeholders. In-depth understanding of Python software development stacks, ecosystems, frameworks, and tools such as NumPy, SciPy, pandas, Dask, spaCy, NLTK, scikit-learn, and PyTorch. Experience with popular Python frameworks such as Django, Flask, or Pyramid. Experience with Jupyter Notebooks. Experience with EDA
development practices. Knowledge of Apache Spark and similar frameworks for processing streaming data. Experience with some of the following Python libraries: NumPy, pandas, PySpark, Dask, Apache Airflow, Luigi, SQLAlchemy, Great Expectations, petl, Boto3, Matplotlib, dbutils, Koalas, OpenPyXL, XlsxWriter. Experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK Stack, Splunk).
cases. Proficient in one of the deep learning stacks such as PyTorch or TensorFlow. Working knowledge of parallelisation and async paradigms in Python, Spark, Dask, and Ray. An awareness of and interest in economic, financial, and general business concepts and terminology. Excellent written and verbal command of English. Strong problem-solving
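The parallelisation paradigm these listings refer to can be illustrated with a minimal stdlib sketch using concurrent.futures — the fan-out/gather pattern that frameworks like Spark, Dask, and Ray scale from one machine to a cluster. The `cpu_bound` function here is a hypothetical stand-in for real work, not part of any listing:

```python
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n: int) -> int:
    """Hypothetical CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

def parallel_map(values):
    """Fan tasks out across worker processes and gather results in input order."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(cpu_bound, values))

if __name__ == "__main__":
    # Each input is processed in a separate process, bypassing the GIL.
    print(parallel_map([10, 100, 1000]))
```

Dask's `dask.delayed` and Ray's `@ray.remote` express the same idea, but add lazy task graphs and distributed scheduling on top.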
scikit-learn, XGBoost, LightGBM, statsmodels NLP & ML Frameworks: spaCy, Hugging Face Transformers, TensorFlow or PyTorch (where applicable) Data Engineering & Pipelines: dbt, Airflow, SQL, Spark, Dask Visualisation: Plotly, Seaborn, Matplotlib, Dash, Streamlit Dev & Collaboration Tools: Jupyter, Git, Docker, VS Code, CI/CD tools Ideal but Not Required: Fluency in French
of Python, R, and Java. Experience scaling machine learning on data and compute grids. Proficiency with Kubernetes, Docker, Linux, and cloud computing. Experience with Dask, Airflow, and MLflow. Familiarity with MLOps, CI, Git, and Agile processes. Why you should not miss this career opportunity: We are a mission-driven firm
ecosystems; experience with other languages is a plus Ability to deploy ML capabilities into systems Familiarity with Python data science libraries (NumPy, SciPy, pandas, Dask, spaCy, NLTK, scikit-learn) Commitment to writing clean, maintainable, and well-tested code Proficiency in automation, system monitoring, and cloud platforms like AWS or Azure
ML models into production environments, including both batch and real-time/streaming contexts. Proficiency working with distributed computing frameworks such as Apache Spark, Dask, or similar. Experience with cloud-native ML deployment, particularly on AWS, using services like ECS, EKS, Fargate, Lambda, S3, and more. Familiarity with orchestration and
Airflow Willingness to work with proprietary technologies like Slang/SECDB Understanding of compute resources and performance metrics Knowledge of distributed computing frameworks like Dask and cloud processing Experience managing projects through the entire SDLC About Goldman Sachs Goldman Sachs is a leading global investment banking, securities, and investment management firm
to interpret performance metrics (e.g., CPU, memory, threads, file handles). Knowledge and experience in distributed computing: parallel computation on a single machine (e.g., with Dask) and distributed processing on public cloud. Knowledge of the SDLC and experience working through the entire life cycle of a project from start to end.
C/C++ HPC experience: development and operation of highly distributed software systems, and development of distributed, multi-core, data-driven processing algorithms (using, e.g., Dask, MPI, and/or OpenMP).
COMPANY OVERVIEW XCEL Engineering, Inc. is an award-winning small business that provides trusted information technology, engineering, consulting and project management solutions and services to federal agencies and organizations. Originally founded in 1971 by professional engineers at the University of