Developing Brain Institute (DBI), Washington, D.C.
Stars
PDF++: the most Obsidian-native PDF annotation & viewing tool ever. Comes with optional Vim keybindings.
USCIS Employment-based-2 national interest waiver wait time
An extremely fast Python package and project manager, written in Rust.
The LLM's practical guide: From the fundamentals to deploying advanced LLM and RAG apps to AWS using LLMOps best practices
Ontology for all aspects of human mental functioning
Python script that downloads all PubMed abstracts matching user-specified keyword searches by performing automated NCBI E-utility queries
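The E-utility workflow such a script automates starts with a keyword query against NCBI's ESearch endpoint. A minimal sketch of building that request URL (the helper name and parameter defaults are illustrative assumptions, not this repository's code):

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch_url(term: str, retmax: int = 100) -> str:
    """Build an NCBI ESearch URL for a PubMed keyword query.

    Illustrative sketch only: real use should also batch results
    and respect NCBI's rate limits.
    """
    params = {
        "db": "pubmed",     # search the PubMed database
        "term": term,       # user-specified keyword query
        "retmax": retmax,   # maximum number of IDs to return
        "retmode": "json",  # machine-readable response
    }
    return f"{EUTILS}/esearch.fcgi?" + urlencode(params)
```

The returned IDs would then be passed to EFetch to retrieve the abstracts themselves.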
This repository contains tools and resources developed by Team Nassar during the "Resistance is Futile: A Codeathon to Combat Antimicrobial Resistance" (September 2024), hosted by NCBI and NIAID.
Train transformer language models with reinforcement learning.
Robust recipes to align language models with human and AI preferences
A version of the popular TACRED dataset with significantly higher label quality
A basic AMIA (American Medical Informatics Association) style compliant LaTeX paper skeleton
The Levenshtein Python C extension module contains functions for fast computation of Levenshtein distance and string similarity
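The metric this C extension computes is the classic edit distance; a minimal pure-Python sketch of it (the extension itself is far faster, and this function is not its API):

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character insertions, deletions,
    and substitutions needed to turn a into b (dynamic programming)."""
    # prev[j] holds the distance between a[:i-1] and b[:j].
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]
```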
Evaluation script for named entity recognition (NER) systems based on entity-level F1 score.
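Entity-level scoring means a prediction counts as correct only on an exact match of span and type, not per-token overlap. A minimal sketch of that idea (function and tuple layout are assumptions for illustration, not this script's interface):

```python
def entity_f1(gold, pred):
    """Entity-level precision, recall, and F1.

    gold and pred are iterables of (entity_type, start, end) tuples;
    a predicted entity is a true positive only on an exact match.
    """
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```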
[WWWJ 2024] LLMs for Knowledge Graph Construction and Reasoning: Recent Capabilities and Future Opportunities
A python module to repair invalid JSON from LLMs
CUDA Templates and Python DSLs for High-Performance Linear Algebra
This repository holds the code for working with data from counselchat.com
🧠💬 Enhance mental health research with a custom Named Entity Recognition model ❤️💊
Accurately find/replace/remove emojis in text strings
NLP-based mental health classification.
A curated list of NLP resources focused on Transformer networks, attention mechanism, GPT, BERT, ChatGPT, LLMs, and transfer learning.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
A workshop on SPARQL, OWL, and Datalog reasoning with RDFox
Unsubscribe from all channels on YouTube
A package supporting the conversion from Synthea CSV to OMOP CDM