network changes. Required Qualifications: Bachelor's degree in Computer Science with at least eight (8) years of relevant experience or equivalent. Proficiency with Python, C++, and Spark. Experience with Jupyter Notebooks, JIRA, Confluence, and Git/GitLab. Understanding of ASDF and TCLD. Knowledge of XKEYSCORE/ABYSS fingerprints. Preferred Qualifications: Knowledge of customer corporate tools and data repositories. Security Clearance
as part of a cross-functional team. Preferred Qualifications: Experience with Big Data technologies (e.g. HDFS, AWS, Spark, Kafka, Cassandra). Experience with Big Data query tools and engines (e.g. Jupyter Notebook, Trino, DBeaver). Experience with near real-time (NRT) and batch data pipelines. Experience with black-box testing. Experience with client-server products. Knowledge of Data Quality, Data Profiling, and Data Integration.
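For illustration only, here is a minimal sketch of the "Big Data query engine from a notebook" skill this posting describes, using the `trino` Python client; the coordinator host, catalog, schema, and table are placeholders, not details from the posting.

```python
# Minimal sketch: querying Trino from a Jupyter notebook with the `trino`
# Python client. Host, catalog, schema, and table names are placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.example.internal",  # placeholder coordinator host
    port=8080,
    user="analyst",
    catalog="hive",
    schema="events",
)

cur = conn.cursor()
cur.execute(
    "SELECT event_type, count(*) AS n "
    "FROM web_events "
    "WHERE event_date >= DATE '2024-01-01' "
    "GROUP BY event_type ORDER BY n DESC LIMIT 10"
)
for event_type, n in cur.fetchall():
    print(f"{event_type}: {n}")
```

The same query could equally be run through DBeaver or a Spark SQL session; the notebook-plus-client pattern is just the lightest-weight way to explore an engine interactively.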
and dedicated time for your personal development. What you'll be working with:
• Backend: Distributed, event-driven core Java (90% of the code-base), MySQL, Kafka
• Data analytics: Python & Jupyter notebooks, Parquet, Docker
• Testing: JUnit, JMH, JCStress, Jenkins, Selenium, many in-house tools
• OS: Linux (Fedora for development, Rocky in production)
The LMAX way is to use the right tool
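As a rough sketch of the data-analytics side of this stack (Python, Jupyter, Parquet), the snippet below loads a Parquet file with pandas and summarises it; the file path and column names are assumptions, not part of the posting.

```python
# Minimal sketch: loading a Parquet file in a notebook with pandas and
# summarising it. The path and column names are illustrative only.
import pandas as pd

trades = pd.read_parquet("trades.parquet")          # assumed sample file
summary = (
    trades.groupby("instrument")["quantity"]
    .agg(["count", "sum", "mean"])
    .sort_values("count", ascending=False)
)
print(summary.head(10))
```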
/AI/LLM Solutions. A passion for Generative AI and an understanding of the strengths and weaknesses of generative LLMs and AI technologies. Comfortable working with Python and Jupyter Notebooks. Excellent communication skills - you can toggle seamlessly between presenting to CEOs and getting in the weeds or whiteboarding with technical audiences. High tolerance for ambiguity. You can identify
cleaning, and automation. Ability to turn data into stories using tools like Looker Studio, Power BI, or Tableau. Comfortable working with cloud-based platforms and spreadsheets (e.g. GCP BigQuery, Jupyter notebooks, Google Sheets). Understanding of A/B testing, regression models, and evaluation methods. Competencies: 1. Celebrate the Difference 2. Remember Why We're Here 3. Find a Way
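The A/B testing and evaluation requirement above lends itself to a short worked example. This is a minimal sketch of a two-proportion z-test with statsmodels, using made-up conversion counts rather than anything from the posting.

```python
# Minimal sketch of an A/B test: two-proportion z-test on made-up
# conversion counts for a control (A) and a variant (B).
from statsmodels.stats.proportion import proportions_ztest

conversions = [410, 480]     # conversions in A and B (illustrative numbers)
visitors = [10000, 10000]    # visitors exposed to A and B

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected.")
```

In practice the same numbers usually come out of BigQuery or a spreadsheet export, and the test sits alongside a sanity check of sample sizes and the chosen evaluation metric.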
MITRE ATT&CK, STIX, CAPA, and knowledge capture with the customer's relevant knowledge base systems. CISSP, GIAC GREM, or CREA certification is required. Preferred: Intermediate or greater Python/Jupyter experience.
adoption, user engagement, and client satisfaction to measure success and identify areas for improvement. Stay Ahead of the Curve: Continuously monitor advancements with our integrated platforms (e.g. Excel, Jupyter, and more) to identify new opportunities to drive user engagement and satisfaction. You'll need to have: 5+ years of product management experience, demonstrating strong execution capabilities. Strong understanding of
Big Data, dataflows, Artificial Intelligence/Machine Learning (AI/ML) familiarity, Analytics in GME, Jupyter notebooks, and Spark. Due to federal contract requirements, United States citizenship and an active TS/SCI security clearance and polygraph are required for the position. Required: Must be a US Citizen. Must have TS/SCI clearance w/ active polygraph. This position … limitations. These Qualifications Would be Nice to Have: Fully cleared polygraph is preferred. Knowledge of working with Big Data, dataflows, and Machine Learning/Artificial Intelligence. Analytics in GME, Jupyter notebooks, and Spark. $120,000 - $220,000 a year. The pay range for this job, with multi-levels, is a general guideline only and not a guarantee of compensation or
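As a hedged illustration of the "analytics in Jupyter notebooks and Spark" requirement, here is a minimal PySpark sketch of the kind of exploratory aggregation run from a notebook; the input path and column names are placeholders, not customer data or GME specifics.

```python
# Minimal sketch: exploring a dataset with PySpark from a Jupyter notebook.
# The input path and column names are placeholders only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("notebook-analytics").getOrCreate()

df = spark.read.parquet("s3a://example-bucket/flows/")   # assumed location
daily = (
    df.groupBy(F.to_date("timestamp").alias("day"))
      .agg(F.count("*").alias("records"), F.sum("bytes").alias("total_bytes"))
      .orderBy("day")
)
daily.show(20, truncate=False)
```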
yrs Relevant experience must be in computer or information systems design/development/analysis. Discovery Analyst knowledgeable in high-capacity RF accesses. Highly proficient in AMOD tools, microplugins, Jupyter notebooks, PCAP analysis, protocol analysis, and other related mission tools. Preferred: X4 High-Capacity Communications. Completion of military training in a relevant area such as JCAC (Joint Cyber Analysis Course
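The AMOD tools and microplugins named here are mission-specific, but the PCAP and protocol analysis portion can be illustrated generically. This is a minimal sketch using the open-source scapy library as a stand-in, against a placeholder capture file.

```python
# Minimal sketch: basic PCAP protocol breakdown with scapy in a notebook.
# scapy stands in for mission-specific tooling; capture.pcap is a placeholder.
from collections import Counter
from scapy.all import rdpcap, IP, TCP, UDP

packets = rdpcap("capture.pcap")

proto_counts = Counter()
for pkt in packets:
    if pkt.haslayer(TCP):
        proto_counts["TCP"] += 1
    elif pkt.haslayer(UDP):
        proto_counts["UDP"] += 1
    elif pkt.haslayer(IP):
        proto_counts["other-IP"] += 1

print(f"{len(packets)} packets read")
for proto, count in proto_counts.most_common():
    print(f"{proto}: {count}")
```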
experience with one or more of the following:
- Network Exploitation and Target Analysis (NETA)
- Network+
- Microsoft Excel
- DNI
- Network Discovery
Desired Skills:
- DataXplorer
- XKeyscore
- Grapevine
- Darkquest
- Autosnap
- Roadbed
- Reflectionpool
- Jupyter Notebook
Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to specific elections
work experience. Experience with designing cloud-native architectures using cloud services such as AWS, Google, IBM, and Oracle. Experience and expertise in writing efficient and performant SQL queries. Originate JupyterHub Notebooks for user consumption, in addition to providing user facilitation in both originating and correcting SQL queries to maximize system performance, while ensuring user requirements are exceeded. Experience designing and … Agile certification.
Databases: Postgres, MariaDB, ELK, MinIO, AWS S3, Neo4j, MongoDB, NoSQL
Languages: Python, SQL
Operating Systems: CentOS7, RockyLinux8
Orchestration: Kubernetes, Docker, Docker-Compose, Docker-Swarm
Development Tools: GitLab, JupyterHub/Notebooks
Environments: Large collaboration, Agile development environments
Data types: Unstructured, structured, or semi-structured data, including CSV, JSON, JSONL, AVRO, Protocol Buffers, Parquet, etc.
Position Clearance Requirement: TS/
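As a loose illustration of checking SQL performance from a JupyterHub notebook against one of the listed databases (Postgres), here is a minimal sketch; the connection details, table, and query are placeholders and assume the psycopg2 driver is available.

```python
# Minimal sketch: inspecting a query plan against Postgres from a notebook
# with psycopg2. Connection details and the table name are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="db.example.internal", dbname="analytics",
    user="analyst", password="changeme",
)
with conn, conn.cursor() as cur:
    cur.execute(
        "EXPLAIN ANALYZE "
        "SELECT source, count(*) FROM ingest_log "
        "WHERE ingested_at >= now() - interval '1 day' "
        "GROUP BY source"
    )
    for (line,) in cur.fetchall():
        print(line)   # plan rows expose sequential scans, sort costs, etc.
conn.close()
```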
Defender XDR, Entra, Purview). Create scripts, APIs, and orchestrations that reduce manual effort and improve speed and accuracy in security operations. - Tell Stories with Data: Use tools like Jupyter Notebooks, Kusto Query Language (KQL), and Python to query and visualize large-scale security datasets. Translate telemetry into insights and share narratives that influence decision-making across engineering and leadership … engineering, preferably in cloud-native or regulated environments. - Strong programming/scripting skills (Python preferred) with a focus on infrastructure and operations tooling. - Experience working with large datasets in Jupyter Notebooks and building dashboards or reports for security posture and compliance. - Strong communication skills with an ability to convey technical concepts to non-technical stakeholders. - Role is UK-based and
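A minimal sketch of the "query large-scale security datasets with KQL and Python" workflow described above, using the azure-kusto-data client; the cluster URL, database, table, and columns are hypothetical, and Azure CLI authentication is assumed to be configured.

```python
# Minimal sketch: running a KQL query from a notebook with azure-kusto-data.
# Cluster URL, database, table, and columns are placeholders; assumes
# `az login` has already been run for CLI authentication.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
from azure.kusto.data.helpers import dataframe_from_result_table

cluster = "https://example-cluster.kusto.windows.net"   # placeholder
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster)
client = KustoClient(kcsb)

query = """
SignInEvents
| where Timestamp > ago(7d)
| summarize failures = countif(ResultType != "success") by AppName
| top 10 by failures
"""
response = client.execute("SecurityTelemetry", query)   # placeholder database
df = dataframe_from_result_table(response.primary_results[0])
print(df)
```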
services. The CCE provides several containerized services that customers can provision and access on demand inside a Kubernetes cluster. Some of the services include GitLab CI/CD, Jupyter Notebooks, MinIO, and several other products. The Software Engineer shall perform maintenance and troubleshooting of containerized applications, upgrade services, assist customers using the CCE, and develop any required software for … Charts for Kubernetes. Experience with developing services on Kubernetes. Experience creating containerized applications/services using Docker. Experience providing customer outreach and troubleshooting support. Position Desired Skills: Experience with Jupyter Notebooks. Experience with GitLab CI/CD pipelines. Experience updating containerized applications to address STE/STN requirements. Experience developing with REST APIs. Experience with the Atlassian Tool suite (Bitbucket
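For the routine maintenance and troubleshooting work described here, a minimal sketch using the official kubernetes Python client to list pods that are not in a Running state; the namespace is a placeholder and local kubeconfig credentials are assumed.

```python
# Minimal sketch: spotting unhealthy pods in a namespace with the official
# kubernetes Python client. The namespace name is a placeholder.
from kubernetes import client, config

config.load_kube_config()          # uses the local kubeconfig/credentials
v1 = client.CoreV1Api()

namespace = "cce-services"         # placeholder namespace
pods = v1.list_namespaced_pod(namespace=namespace)
for pod in pods.items:
    phase = pod.status.phase
    if phase != "Running":
        print(f"{pod.metadata.name}: {phase}")
```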