text-generation-webui fails to load codellama: DLL load failed while importing flash_attn_2_cuda
Loading codellama with text-generation-webui fails with the following error (找不到指定的模块 is Windows for "The specified module could not be found"):
Traceback (most recent call last):
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\utils\import_utils.py", line 1353, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "D:\Anaconda\Anaconda\envs\codellama\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\models\llama\modeling_llama.py", line 48, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\flash_attn\__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\flash_attn\flash_attn_interface.py", line 8, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: DLL load failed while importing flash_attn_2_cuda: 找不到指定的模块。

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "E:\模型\text-generation-webui\text-generation-webui\modules\ui_model_menu.py", line 209, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "E:\模型\text-generation-webui\text-generation-webui\modules\models.py", line 85, in load_model
    output = load_func_map[loader](model_name)
  File "E:\模型\text-generation-webui\text-generation-webui\modules\models.py", line 155, in huggingface_loader
    model = LoaderClass.from_pretrained(path_to_model, **params)
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\models\auto\auto_factory.py", line 565, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\models\auto\auto_factory.py", line 387, in _get_model_class
    supported_models = model_mapping[type(config)]
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\models\auto\auto_factory.py", line 740, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\models\auto\auto_factory.py", line 754, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\models\auto\auto_factory.py", line 698, in getattribute_from_module
    if hasattr(module, attr):
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\utils\import_utils.py", line 1343, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\Ma\AppData\Roaming\Python\Python310\site-packages\transformers\utils\import_utils.py", line 1355, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
DLL load failed while importing flash_attn_2_cuda: 找不到指定的模块。
At first I suspected the transformers version was wrong, so I checked that first: for this code path, transformers should be at least 4.35.0.
After upgrading transformers to 4.35.0, the error persisted.
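To confirm the installed version before ruling transformers out, a minimal check along these lines is enough (it only assumes transformers is installed in the active conda environment):

import transformers

# The reported version should be 4.35.0 or newer, as noted above.
print(transformers.__version__)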
Next, I checked the CUDA and torch versions.
It turned out to be a mismatch between the CUDA version and the torch build: flash_attn_2_cuda is a precompiled CUDA extension, and it fails to load when the installed torch was built against a different CUDA version than the flash-attn wheel expects.
>>> import torch
>>> print(torch.version.cuda)  # check which CUDA version torch was built with
11.8
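A slightly fuller sketch of the same check (assuming only that torch is importable and an NVIDIA driver is installed) also prints the torch build string and whether the GPU is actually visible, which helps spot this kind of mismatch:

import torch

print(torch.__version__)           # build tag, e.g. 2.1.0+cu118 means built against CUDA 11.8
print(torch.version.cuda)          # CUDA version baked into this torch build
print(torch.cuda.is_available())   # False can also point at a driver/toolkit mismatch
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))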
Running nvcc --version in a console:
Output:
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Wed_Feb__8_05:53:42_Coordinated_Universal_Time_2023
Cuda compilation tools, release 12.1, V12.1.66
Build cuda_12.1.r12.1/compiler.32415258_0
The fix:
First uninstall the existing torch packages:
pip uninstall torch torchvision torchaudio
Then install the CUDA 12.1 builds:
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
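Once the reinstall finishes, a quick sanity check (a minimal sketch; it assumes flash-attn is still installed from before) is to confirm that torch now reports CUDA 12.1 and that the flash-attn import from the traceback no longer fails:

import torch

print(torch.__version__)      # should now carry a +cu121 build tag
print(torch.version.cuda)     # should print 12.1
print(torch.cuda.is_available())

# The import that failed in the traceback above; if it succeeds, the DLL error is gone.
from flash_attn import flash_attn_func, flash_attn_varlen_func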
After that, codellama loaded successfully.