aifaq.wtf

"How do you know about all this AI stuff?"
I just read tweets, buddy.

#local models


Improving Search Ranking with Few-Shot Prompting of LLMs

#fine-tuning   #shortcuts   #local models   #models   #performance   #evaluation   #link  

This pairs well with Hugging Face's "Synthetic data: save money, time and carbon with open source".
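To make the idea concrete, here's a minimal sketch of few-shot relevance scoring with an LLM. The prompt wording, example pairs, and the gpt-4o-mini model name are placeholders of mine, not taken from the linked post.

```python
# Hypothetical sketch: score query-document pairs with a few-shot prompt,
# then re-rank search results by the returned relevance grade.
from openai import OpenAI

client = OpenAI()

FEW_SHOT = """Rate how relevant the document is to the query, from 0 (irrelevant) to 3 (perfect).

Query: best budget mechanical keyboard
Document: Review of five mechanical keyboards under $60.
Relevance: 3

Query: best budget mechanical keyboard
Document: History of the QWERTY layout.
Relevance: 1
"""

def score(query: str, document: str) -> int:
    prompt = f"{FEW_SHOT}\nQuery: {query}\nDocument: {document}\nRelevance:"
    resp = client.chat.completions.create(
        model="gpt-4o-mini",          # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        max_tokens=1,
        temperature=0,
    )
    return int(resp.choices[0].message.content.strip())

# Re-rank candidates from an existing search index by LLM-judged relevance.
results = ["doc one text ...", "doc two text ..."]
reranked = sorted(results, key=lambda d: score("budget mechanical keyboard", d), reverse=True)
```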

What is better? 7B-Q4_K_M or 13B_Q2_K? (Reddit)

#local models   #quantization   #diy   #explanations and guides and tutorials   #link  

A discussion of model sizes vs quantization on /r/LocalLLaMA, relevant for anyone interested in running models on their own machines. Generally:

I've read that a larger-sized model even at a lower quant will most likely yield better results than a smaller model at a higher quant

And this is a great ELI5 for quantization.
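For a back-of-the-envelope sense of what those choices cost in memory, weight size is roughly parameters × bits per weight. The bits-per-weight figures below are my approximations for llama.cpp's Q4_K_M and Q2_K formats, not numbers from the thread.

```python
# Rough weight-only memory estimate (ignores KV cache and runtime overhead).
# Bits-per-weight values are approximate effective averages for llama.cpp
# quant formats, used here purely for illustration.
def weight_gb(params_billion: float, bits_per_weight: float) -> float:
    # 1e9 params * bits / 8 bits-per-byte / 1e9 bytes-per-GB simplifies to:
    return params_billion * bits_per_weight / 8

print(f"7B  Q4_K_M ~ {weight_gb(7, 4.8):.1f} GB")    # roughly 4.2 GB
print(f"13B Q2_K   ~ {weight_gb(13, 3.35):.1f} GB")  # roughly 5.4 GB
```

With those rough numbers the 13B at Q2_K is only modestly larger on disk than the 7B at Q4_K_M, which is what makes the comparison worth arguing about in the first place.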

@simonw on July 12, 2023

#local models   #user experience   #user interface   #tools   #open models   #models   #tweets