Commit graph

3657 commits

Author SHA1 Message Date
oobabooga
18ca35faaa Space between chat tab and extensions block 2023-12-25 08:34:02 -08:00
oobabooga
73ba7a8921 Change height -> min-height for .chat 2023-12-25 08:32:02 -08:00
oobabooga
29b0f14d5a Bump llama-cpp-python to 0.2.25 (#5077) 2023-12-25 12:36:32 -03:00
oobabooga
c06f630bcc Increase max_updates_second maximum value 2023-12-24 13:29:47 -08:00
Casper
92d5e64a82 Bump AutoAWQ to 0.1.8 (#5061) 2023-12-24 14:27:34 -03:00
oobabooga
4aeebfc571 Merge branch 'dev' into TheLounger-style_improvements 2023-12-24 09:24:55 -08:00
oobabooga
d76b00c211 Pin lm_eval package version 2023-12-24 09:22:31 -08:00
oobabooga
8c60495878 UI: add "Maximum UI updates/second" parameter 2023-12-24 09:17:40 -08:00
zhangningboo
1b8b61b928 Fix output_ids decoding for Qwen/Qwen-7B-Chat (#5045) 2023-12-22 23:11:02 -03:00
kabachuha
dbe438564e Support for sending images into OpenAI chat API (#4827) 2023-12-22 22:45:53 -03:00
Stefan Daniel Schwarz
8956f3ebe2 Synthia instruction templates (#5041) 2023-12-22 22:19:43 -03:00
Yiximail
afc91edcb2 Reset the model_name after unloading the model (#5051) 2023-12-22 22:18:24 -03:00
Lounger
554a8f910b Attempt at shrinking chat area when input box grows 2023-12-22 04:51:20 +01:00
Lounger
588b37c032 Add slight padding to top of message container 2023-12-21 22:04:41 +01:00
Lounger
568541aa31 Remove bottom padding on chat tab 2023-12-21 21:48:34 +01:00
oobabooga
c1b99f45cb Make --help output instant 2023-12-21 09:32:20 -08:00
Lounger
0dd759c44f Claim more vertical space 2023-12-21 05:42:06 +01:00
Lounger
6fbd64db72 Set borders for all chat styles 2023-12-21 05:00:56 +01:00
oobabooga
2706149c65 Organize the CMD arguments by group (#5027) 2023-12-21 00:33:55 -03:00
oobabooga
c727a70572 Remove redundancy from modules/loaders.py 2023-12-20 19:18:07 -08:00
Lounger
e3e053ab99 UI: Expand chat vertically and handle header wrapping 2023-12-21 03:42:23 +01:00
Lounger
a098c7eee3 Merge branch 'dev' into style_improvements 2023-12-20 23:09:15 +01:00
luna
6efbe3009f let exllama v1 models load safetensor loras (#4854) 2023-12-20 13:29:19 -03:00
oobabooga
bcba200790 Fix EOS being ignored in ExLlamav2 after previous commit 2023-12-20 07:54:06 -08:00
oobabooga
f0f6d9bdf9 Add HQQ back & update version (This reverts commit 2289e9031e.) 2023-12-20 07:46:09 -08:00
oobabooga
b15f510154 Optimize ExLlamav2 (non-HF) loader 2023-12-20 07:31:42 -08:00
oobabooga
258c695ead Add rich requirement 2023-12-19 21:58:36 -08:00
oobabooga
fadb295d4d Lint 2023-12-19 21:36:57 -08:00
oobabooga
2289e9031e Remove HQQ from requirements (after https://github.com/oobabooga/text-generation-webui/issues/4993) 2023-12-19 21:33:49 -08:00
oobabooga
fb8ee9f7ff Add a specific error if HQQ is missing 2023-12-19 21:32:58 -08:00
oobabooga
366c93a008 Hide a warning 2023-12-19 21:03:20 -08:00
oobabooga
9992f7d8c0 Improve several log messages 2023-12-19 20:54:32 -08:00
oobabooga
23818dc098 Better logger (Credits: vladmandic/automatic) 2023-12-19 20:38:33 -08:00
oobabooga
95600073bc Add an informative error when extension requirements are missing 2023-12-19 20:20:45 -08:00
Lounger
f9accd38e0 UI: Update chat instruct styles 2023-12-20 02:54:08 +01:00
oobabooga
d8279dc710 Replace character name placeholders in chat context (closes #5007) 2023-12-19 17:31:46 -08:00
Lounger
ff3e845b04 UI: Header boy is dropping shadows 2023-12-20 01:24:34 +01:00
Lounger
40d5bf6c35 Set margin on other tabs too 2023-12-19 23:42:13 +01:00
Lounger
f42074b6c1 UI: Remove header margin on chat tab 2023-12-19 23:27:11 +01:00
oobabooga
e83e6cedbe Organize the model menu 2023-12-19 13:18:26 -08:00
oobabooga
f4ae0075e8 Fix conversion from old template format to jinja2 2023-12-19 13:16:52 -08:00
oobabooga
de138b8ba6 Add llama-cpp-python wheels with tensor cores support (#5003) 2023-12-19 17:30:53 -03:00
oobabooga
0a299d5959 Bump llama-cpp-python to 0.2.24 (#5001) 2023-12-19 15:22:21 -03:00
oobabooga
83cf1a6b67 Fix Yi space issue (closes #4996) 2023-12-19 07:54:19 -08:00
oobabooga
9847809a7a Add a warning about ppl evaluation without --no_use_fast 2023-12-18 18:09:24 -08:00
oobabooga
f6d701624c UI: mention that QuIP# does not work on Windows 2023-12-18 18:05:02 -08:00
oobabooga
a23a004434 Update the example template 2023-12-18 17:47:35 -08:00
oobabooga
3d10c574e7 Fix custom system messages in instruction templates 2023-12-18 17:45:06 -08:00
dependabot[bot]
9e48e50428 Update optimum requirement from ==1.15.* to ==1.16.* (#4986) 2023-12-18 21:43:29 -03:00
俞航
9fa3883630 Add ROCm wheels for exllamav2 (#4973) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) 2023-12-18 21:40:38 -03:00