Machine learning resources
Online resources : dataloop | Hugging Face models & papers
Coding assistants : Windsurf VS Code plugin | Refact.ai | CodeConvert
APIs : Anthropic API | Google GenAI | OpenAI-python | …
LLMs : Ollama, Whisper, Gemma, spaCy
Light LLMs : LitGPT | Pythia | TinyLlama | NanoGPT-128M
Physics : PhysicsNeMo (e.g. the darcy_fno Fourier neural operator example)
Geophysics : SPADE-Terrain-GAN | ada_multigrid_ppo
Digital Rock Physics : GeoSlicer
Training : Lookahead | Gymnasium
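Lookahead (listed above) wraps any inner optimizer: the "fast" weights take k ordinary steps, then the "slow" weights interpolate toward them and the fast weights restart from the result. A minimal pure-Python sketch on a toy quadratic; the objective, hyperparameters, and function names are illustrative, not taken from any library:

```python
# Sketch of the Lookahead idea: k fast SGD steps, then slow weights
# move a fraction alpha toward the fast weights.
# Toy objective f(w) = (w - 3)^2, minimized at w = 3.

def grad(w):
    return 2.0 * (w - 3.0)          # gradient of (w - 3)^2

def lookahead_sgd(w0, lr=0.1, k=5, alpha=0.5, outer_steps=20):
    slow = fast = w0
    for _ in range(outer_steps):
        for _ in range(k):          # k fast (inner) SGD steps
            fast -= lr * grad(fast)
        slow += alpha * (fast - slow)   # slow weights interpolate
        fast = slow                     # fast weights restart from slow
    return slow

w = lookahead_sgd(0.0)
print(round(w, 4))   # approaches the minimum at w = 3
```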
Distributed training : NCCL | Gloo | MPI
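NCCL, Gloo and MPI all expose an all-reduce collective for summing gradients across workers; the classic bandwidth-efficient implementation is the ring algorithm (reduce-scatter, then all-gather). A hedged in-process simulation where "workers" are plain lists and the messaging is simulated, not real networking:

```python
# Ring all-reduce simulation: each of n workers holds a vector split
# into n chunks (one scalar per chunk here). Chunks travel around the
# ring: n-1 summing steps (reduce-scatter), n-1 copy steps (all-gather).

def ring_allreduce(buffers):
    n = len(buffers)
    # Reduce-scatter: after n-1 steps, worker i owns the full sum
    # of chunk (i+1) % n.
    for t in range(n - 1):
        msgs = [(i, (i - t) % n, buffers[i][(i - t) % n]) for i in range(n)]
        for src, c, val in msgs:
            buffers[(src + 1) % n][c] += val   # receiver accumulates
    # All-gather: circulate each reduced chunk so every worker gets it.
    for t in range(n - 1):
        msgs = [(i, (i + 1 - t) % n, buffers[i][(i + 1 - t) % n]) for i in range(n)]
        for src, c, val in msgs:
            buffers[(src + 1) % n][c] = val    # receiver overwrites
    return buffers

workers = [[1.0, 2.0, 3.0, 4.0],
           [1.0, 1.0, 1.0, 1.0],
           [0.0, 0.0, 0.0, 0.0],
           [2.0, 2.0, 2.0, 2.0]]
out = ring_allreduce([list(w) for w in workers])
print(out[0])   # [4.0, 5.0, 6.0, 7.0] — the elementwise sum, on every worker
```

Each worker sends only its own chunks to its ring neighbor, which is why the pattern scales with bandwidth rather than worker count.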
ML deployment : llama.cpp | PyTorch | ONNX | JAX ONNX Runtime | ggml | MLflow
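Much of what makes llama.cpp/ggml deployments small is blockwise weight quantization. A toy Python sketch of the idea behind an 8-bit, one-scale-per-block scheme (in the spirit of ggml's Q8_0); this illustrates the principle only and is not the actual ggml file format:

```python
# Blockwise 8-bit quantization sketch: store one float scale per block
# plus small integers, instead of a full float per weight.

def quantize_q8(block):
    scale = max(abs(x) for x in block) / 127.0 or 1.0  # avoid zero scale
    q = [round(x / scale) for x in block]              # ints in [-127, 127]
    return scale, q

def dequantize_q8(scale, q):
    return [scale * v for v in q]

weights = [0.02, -1.27, 0.64, 0.005]
scale, q = quantize_q8(weights)
restored = dequantize_q8(scale, q)
err = max(abs(a - b) for a, b in zip(weights, restored))
print(err < scale)   # True: error stays below one quantization step
```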
UI : ComfyUI
Agents : stanford-oval/storm, ottomator
Datasets : VQA | WeatherBench | COCO | FineWeb
Computational costs, based on F.G. Raeini's MSc thesis; per batch of 128, GPU: N100, CPU: AMD 5700U:
Model | Num-Params | Size (MB) | Inf. time (GPU) | Train time (CPU) | Acc. (VQA-v2, AOK-VQA) |
---|---|---|---|---|---|
ViLT | 82M | 470 | 2 s | 25 s | 72%, 44% |
ResAttLSTM | 80 | | 2.5 s | | 62%, 30% |
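The timings above are per batch of 128 samples, so dividing by the batch size converts them into per-sample latency and throughput, which makes models easier to compare; e.g. for the ViLT GPU inference row:

```python
# Convert the table's per-batch timing into per-sample numbers.
batch_size = 128
batch_time_s = 2.0                        # ViLT GPU inference per batch

per_sample_ms = batch_time_s / batch_size * 1000
throughput = batch_size / batch_time_s    # samples per second

print(f"{per_sample_ms} ms/sample, {throughput} samples/s")
```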
Larger models:
GIT: 707MB | Qwen2-72B: 43GB | BLIP: 990MB-1.9GB | Florence-230M | LLaVA-7B: 15GB | LAVIS
VQA datasets:
Dataset | VQA-v2 | VQA-v1 | AOK-VQA | VizWiz |
---|---|---|---|---|
train, val | 443k, 214k | 214k, 121k | 17.0k, 1.14k | 200k, 40k |