text-generation-webui/modules
File | Last commit message | Last commit date
AutoGPTQ_loader.py | Add --no_use_cuda_fp16 param for AutoGPTQ | 2023-06-23 12:22:56 -03:00
block_requests.py | Block a cloudfare request | 2023-07-06 22:24:52 -07:00
callbacks.py | Make stop_everything work with non-streamed generation (#2848) | 2023-06-24 11:19:16 -03:00
chat.py | Change the filenames for caches and histories | 2023-08-09 07:47:19 -07:00
deepspeed_parameters.py | Fix typo in deepspeed_parameters.py (#3222) | 2023-07-24 11:17:28 -03:00
evaluate.py | Sort some imports | 2023-06-25 01:44:36 -03:00
exllama.py | Credit turboderp | 2023-08-06 13:43:15 -07:00
exllama_hf.py | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00
extensions.py | Add extension example, replace input_hijack with chat_input_modifier (#3307) | 2023-07-25 18:49:56 -03:00
github.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00
GPTQ_loader.py | Remove unused import | 2023-08-10 00:10:14 -05:00
html_generator.py | Change the filenames for caches and histories | 2023-08-09 07:47:19 -07:00
llama_attn_hijack.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00
llamacpp_hf.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
llamacpp_model.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
loaders.py | Add RoPE scaling support for transformers (including dynamic NTK) | 2023-08-08 21:25:48 -07:00
logging_colors.py | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00
LoRA.py | Use 'torch.backends.mps.is_available' to check if mps is supported (#3164) | 2023-07-17 21:27:18 -03:00
models.py | Add RoPE scaling support for transformers (including dynamic NTK) | 2023-08-08 21:25:48 -07:00
models_settings.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
monkey_patch_gptq_lora.py | Revert "Remove GPTQ-for-LLaMa monkey patch support" | 2023-08-10 08:39:41 -07:00
presets.py | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00
prompts.py | Move characters/instruction-following to instruction-templates | 2023-08-06 17:50:32 -07:00
relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00
RWKV.py | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00
sampler_hijack.py | Fix: Mirostat fails on models split across multiple GPUs | 2023-08-05 13:45:47 -03:00
shared.py | Streamline GPTQ-for-LLaMa support | 2023-08-09 23:42:34 -05:00
text_generation.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
training.py | Revert "Remove GPTQ-for-LLaMa monkey patch support" | 2023-08-10 08:39:41 -07:00
ui.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
ui_chat.py | Increase the Context/Greeting boxes sizes | 2023-08-08 00:09:00 -03:00
ui_default.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
ui_file_saving.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
ui_model_menu.py | Update installation documentation | 2023-08-10 00:53:48 -05:00
ui_notebook.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
ui_parameters.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
ui_session.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
utils.py | Don't show oobabooga_llama-tokenizer in the model dropdown | 2023-08-10 10:02:48 -07:00