Job Description: We are seeking a Data Scientist proficient in Python and experienced in automating workflows, data manipulation, and visualization using Jupyter Notebooks. This role involves leveraging Python expertise to streamline processes and create insightful visualizations for data-driven decision-making. The Level 3 Data Scientist shall possess the following capabilities: Employ some combination (two or more) of the following …
e.g., Apache Spark, Scikit-learn, XGBoost) Experience with various data manipulation and pipeline libraries (e.g., Pandas, Polars, Matplotlib, Plotly, NumPy, SciPy) Experience with data science environments (e.g., Jupyter Notebook, Databricks, or Amazon SageMaker) Experience with Python, R, and/or Java programming languages Experience implementing the data science process, developing experiments, and reporting and explaining results Familiar …
instances using Docker, Kubernetes, Ansible, and Terraform technologies. Created and maintained data objects using AWS S3 bucket services. Integrated authentication & authorization using Keycloak. Created Python and Jupyter Notebook modules and components to support machine learning and data science activities. This is a hands-on coding/programming position. 1) Data flow & analytic services using the AWS Cloud … and Neo4j technologies (full stack) 4) AWS automation and orchestration services 5) Create unit tests using Mockito & JUnit components 6) Kubernetes 7) Ansible and Terraform technologies 8) Python and Jupyter Notebook modules …
preparation, data governance, and analytic development and production Must have in-depth experience with at least two of the following advanced scripting languages and tools: Python, R, SQL, Lucene, Pig, Scala, ELK Stack, Splunk, Power BI, or Jupyter Notebooks Preferred: Experience instructing NCU courses/NCU Adjunct Certified Compensation Range: $110,000 - $256,000 (dependent on years of experience, education …
like Apache Airflow, NiFi, and Kafka Strong analytical and problem-solving skills Excellent communication and teamwork abilities Eagerness to learn and grow in a fast-paced environment Experience with Jupyter Notebooks and PostgreSQL Experience with version control systems (e.g., Git) Desired Qualifications: Knowledge of data lake technologies and big data tools (e.g., Spark) Familiarity with containerization tools like Docker …
taskings, or collection/processing workflows. Prior work with Sponsor data systems, applications, or database structures. Familiarity with Sponsor data handling procedures and clearance environments. Hands-on experience with Jupyter Notebooks, Visio, SharePoint, Confluence, or other visualization/documentation tools. Experience working with APIs, including integration and data extraction. Knowledge of version control systems such as GitHub. Proficiency in R …
you're iterative, pragmatic, and eager to deliver value through experimentation, learning, and continuous improvement. You're developing fluency with modern data science tools and technologies such as Python, Jupyter, SQL, pandas, and visualization libraries. You have a basic understanding of machine learning workflows and are excited to deepen your knowledge through hands-on experience. You embrace AI tools to …
TDNAs, DSs, and fellow SMEs. SIGDEV or RF (HF, VHF, UHF, SATCOM) experience is a nice-to-have. You will need demonstrated experience with data processing, normalization, and exploratory analysis. Jupyter Notebooks and Python experience, data modeling experience leveraging various algorithms, and data visualization skills are required. You will need to have the capability to deliver presentations or briefings to a non …
to Have: SIGINT, Cyber, and/or Computer Network Operations (CNO) background. Additional experience in: JavaScript, Vue.js, Hadoop, GM analytic development, .NET environment, debuggers, development of packet-level programs, Jupyter Notebooks, Jira, Confluence, GitLab. $85,000 - $250,000 a year The pay range for this job, with multi-levels, is a general guideline only and not a guarantee of compensation …
preprocessing, language modeling, and semantic similarity. Strong proficiency in Python, including use of ML libraries such as TensorFlow, PyTorch, or similar. Experience with data science tools and platforms (e.g., Jupyter, Pandas, NumPy, MLflow). Familiarity with cloud-based AI tools and infrastructure, especially within the AWS ecosystem. Strong understanding of data structures, algorithms, and statistical analysis. Experience working with ETL …
Learning, AI, Statistics, Economics, or equivalent) 5+ years of professional working experience Someone who thrives in the incremental delivery of high-quality production systems Proficiency in Java, Python, SQL, and Jupyter Notebook Experience with machine learning and statistical inference Understanding of ETL processes and data pipelines and ability to work closely with Machine Learning Engineers for product implementation Ability to communicate …
Query Language (NoSQL), Application Program Interface (API) building, Extract, Transform, and Load (ETL) pipelines, web application servers, or search indexes. • Experience using programming languages and products such as Python, Jupyter Notebook, Pandas, NumPy, Requests, or Antigravity. • Experience applying complex mathematical and statistical concepts. • Experience applying statistical and operations research methods and tools. • Experience employing spreadsheets for data manipulation and visualization. …
clustering, classification, predictive modelling) through coursework, internships, or independent projects You are proficient in Python (especially pandas, numpy, scikit-learn, or similar libraries) and comfortable performing data analysis using Jupyter notebooks or similar tools You are comfortable writing clear, efficient SQL for extracting, cleaning, and preparing datasets, demonstrated through coursework, internships, or personal analytical projects You have demonstrated initiative by …
real business impact. What We're Looking For A strong foundation in data structures, algorithms, data modelling, and software architecture. Solid hands-on experience in Python and its ecosystem (Jupyter, Pandas, Scikit-learn, Matplotlib), and comfort working with SQL for data analysis. Experience with LangChain is a plus. Experience of delivering AI and ML-based products into production environments, with …
McLean, Virginia, United States Hybrid / WFH Options
MITRE
machines). • Experience working with databases (e.g., PostgreSQL, Oracle, MySQL, MongoDB, Neo4j). • Experience using version control (e.g., Git, Mercurial, SVN) to support collaborative development. • Experience utilizing notebooks (e.g., Jupyter, R Markdown, Zeppelin). • Experience developing interactive data visualizations using open-source technologies (e.g., Angular, Vue, React, D3.js) or other frameworks (e.g., Shiny). Why Join MITRE? Why choose between …
analytics. Enhance existing models to leverage analytic output for driving automated tasking. Develop scripts for running and testing the algorithms and analytics, using programs such as Databricks, Python, and Jupyter Notebook. Make presentations and reports at the request of the government, in addition to a task kick-off, midpoint review, and final report and presentation. Experience in at least two … work experience. Desirable skills and experience include synthetic aperture radar image science or analysis; other remote sensing imaging such as multispectral; programming and script development; data science; Python, Databricks, and Jupyter Notebook; and Intelligence Community analytical workflows. …
create new dashboards, reports, and any additional ad-hoc requests, using a wide variety of database applications/analytical tools/languages including GCP BigQuery, SQL, SAS, R, Python, Jupyter Notebook, Adobe Analytics, and Tableau. 10% - Management of Data - Collaborate with other decision analysts to explore new sources of data; cleanse, validate, and test data. 5% - Project Management & Support - Supports … and roadmaps for insights for the online merchandising, marketing, and ELT organizations Ability to partner with requesting organizations to build KPI frameworks Ability to code in Python, R, SQL, Jupyter Notebook, and Vertex AI Experience managing multiple projects and working in a fast-paced environment Critical thinking skills to identify the strengths and weaknesses of alternative solutions; ability to understand and …
culture of Millennium, judged by the ability to deliver timely solutions to portfolio and risk managers within the firm. Mandatory Requirements 3+ years Python development experience (pandas, NumPy, Polars, Jupyter notebooks, FastAPI) Experience with AWS services such as S3, EC2, AWS Batch, and Redshift Proficiency in relational and non-relational database technologies BA or Master's in computer science/…
requirements: Bachelor's Degree Part time or full time: Full Time Flexibility option: High Co-Location (3-4 days on-site) Skills: Data Analytics Data Science Data Visualization GitLab Jupyter Notebooks Python and RStudio …
working with Sponsor business or mission data, Sponsor applications, or Sponsor database structures. Demonstrated experience with the Sponsor's data handling procedures. Demonstrated experience with visualization tools, such as Jupyter, Visio, SharePoint, or Confluence. Demonstrated experience with APIs. Demonstrated experience using GitHub, or similar version control systems. Demonstrated experience using Python or R. Utilizing critical thinking and analytic judgements and …
Washington, Washington DC, United States Hybrid / WFH Options
M9 Solutions
programming languages such as Python. Agile development experience along with related technologies (e.g., Jira). Ability, openness, and eagerness to learn. Skills: Data Analytics Data Science Data Visualization GitLab Jupyter Notebooks Python and RStudio Full-Time Employee Compensation M9 Solutions' pay range for this position is a general guideline only and not a guarantee of compensation or salary. Additional …
Playwright or similar testing frameworks. REST APIs: Strong understanding of integrating and working with RESTful services. Data Skills: Experience in data wrangling/analysis (e.g., using SQL or Python, Jupyter Notebook). Collaboration: Experience working in an Agile environment (Scrum/Kanban). Problem-Solving: Strong analytical and troubleshooting skills. Desirable Skills Familiarity with state management libraries (MobX, Redux). …
software, libraries, and packages in a Linux environment Extensive software development experience with Python and Java Experience with Big Data streaming platforms including Spark Experience with deploying and managing Jupyter Notebook environments Experience with data parsing/transformation technologies including JSON, XML, CSV, and Parquet formats Experience with stream/batch Big Data processing and analytic frameworks Experience with CI …