text-generation-webui/modules (latest commit: 2023-06-06 13:05:05 -03:00)
| File | Latest commit message | Date |
| --- | --- | --- |
| AutoGPTQ_loader.py | Extend AutoGPTQ support for any GPTQ model (#1668) | 2023-06-02 01:33:55 -03:00 |
| callbacks.py | Remove mutable defaults from function signature. (#1663) | 2023-05-08 22:55:41 -03:00 |
| chat.py | Fix a minor bug | 2023-06-06 12:57:13 -03:00 |
| deepspeed_parameters.py | Style improvements (#1957) | 2023-05-09 22:49:39 -03:00 |
| evaluate.py | Remove some unused imports | 2023-06-06 07:05:46 -03:00 |
| extensions.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| GPTQ_loader.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| html_generator.py | Add markdown table rendering | 2023-05-10 13:41:23 -03:00 |
| llama_attn_hijack.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| llamacpp_model.py | Make llama.cpp read prompt size and seed from settings (#2299) | 2023-05-25 10:29:31 -03:00 |
| logging_colors.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| LoRA.py | Handle the case of older autogptq install | 2023-06-06 13:05:05 -03:00 |
| models.py | Remove softprompt support | 2023-06-06 07:42:23 -03:00 |
| monkey_patch_gptq_lora.py | Better warning messages | 2023-05-03 21:43:17 -03:00 |
| RWKV.py | Fix the missing Chinese character bug (#2497) | 2023-06-02 13:45:41 -03:00 |
| sampler_hijack.py | Add tail-free and top-a sampling (#2357) | 2023-05-29 21:40:01 -03:00 |
| shared.py | Remove softprompt support | 2023-06-06 07:42:23 -03:00 |
| text_generation.py | Remove softprompt support | 2023-06-06 07:42:23 -03:00 |
| training.py | Fix warning for qlora (#2438) | 2023-05-30 11:09:18 -03:00 |
| ui.py | Use AutoGPTQ by default for GPTQ models | 2023-06-05 15:41:48 -03:00 |
| utils.py | Remove softprompt support | 2023-06-06 07:42:23 -03:00 |