 F:\BaiduNetdiskDownload\GPT-SoVITS-beta0306fix2>runtime\python.exe webui.py
 Running on local URL:  http://0.0.0.0:9874
 "F:\BaiduNetdiskDownload\GPT-SoVITS-beta0306fix2\runtime\python.exe" GPT_SoVITS/prepare_datasets/2-get-hubert-wav32k.py
 "F:\BaiduNetdiskDownload\GPT-SoVITS-beta0306fix2\runtime\python.exe" GPT_SoVITS/prepare_datasets/2-get-hubert-wav32k.py
(the same traceback was printed by each of the two worker processes; shown once here)
Traceback (most recent call last):
  File "F:\BaiduNetdiskDownload\GPT-SoVITS-beta0306fix2\GPT_SoVITS\prepare_datasets\2-get-hubert-wav32k.py", line 9, in <module>
    os.environ["CUDA_VISIBLE_DEVICES"] = os.environ.get("_CUDA_VISIBLE_DEVICES")
  File "os.py", line 684, in __setitem__
  File "os.py", line 742, in check_str
TypeError: str expected, not NoneType
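The TypeError happens because the script assumes the parent webui.py process exported `_CUDA_VISIBLE_DEVICES` before spawning it; when that variable is absent, `os.environ.get()` returns `None`, and assigning `None` into `os.environ` is rejected (environment values must be strings). A minimal sketch of a defensive rewrite of the failing line, assuming a fallback device string of "0" (the default is my assumption, not the project's):

```python
import os

# The crashing line was:
#   os.environ["CUDA_VISIBLE_DEVICES"] = os.environ.get("_CUDA_VISIBLE_DEVICES")
# If _CUDA_VISIBLE_DEVICES is unset, get() returns None and os.environ
# raises "TypeError: str expected, not NoneType". Supplying a string
# fallback (here the hypothetical default "0") avoids the crash:
os.environ["CUDA_VISIBLE_DEVICES"] = os.environ.get("_CUDA_VISIBLE_DEVICES", "0")
```

This only papers over the symptom; the underlying issue is that webui.py did not pass the variable to the subprocess, so checking how the one-click package launches these prepare_datasets scripts may be the real fix.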
 "F:\BaiduNetdiskDownload\GPT-SoVITS-beta0306fix2\runtime\python.exe" GPT_SoVITS/prepare_datasets/2-get-hubert-wav32k.py
 F:\BaiduNetdiskDownload\GPT-SoVITS-beta0306fix2\runtime\lib\site-packages\transformers\utils\generic.py:441: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
 _torch_pytree._register_pytree_node(
 F:\BaiduNetdiskDownload\GPT-SoVITS-beta0306fix2\runtime\lib\site-packages\transformers\utils\generic.py:309: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
 _torch_pytree._register_pytree_node(
 Some weights of the model checkpoint at GPT_SoVITS/pretrained_models/chinese-hubert-base were not used when initializing HubertModel: ['encoder.pos_conv_embed.conv.weight_v', 'encoder.pos_conv_embed.conv.weight_g']
 - This IS expected if you are initializing HubertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
 - This IS NOT expected if you are initializing HubertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
 Some weights of HubertModel were not initialized from the model checkpoint at GPT_SoVITS/pretrained_models/chinese-hubert-base and are newly initialized: ['encoder.pos_conv_embed.conv.parametrizations.weight.original0', 'encoder.pos_conv_embed.conv.parametrizations.weight.original1']
 You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
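These two warnings are likely cosmetic rather than a second error: newer torch versions store weight-normalized convolution weights under `parametrizations.*` key names, so the checkpoint's old-style keys are reported as "not used" and the renamed keys as "newly initialized", even though they describe the same parameters. A sketch of the key correspondence the warnings imply (the exact pairing of `weight_g`/`weight_v` to `original0`/`original1` here is my reading of the messages, not verified against the checkpoint):

```python
# Old weight_norm key names (in the checkpoint) vs. the parametrized
# names a newer torch/transformers expects for the same tensors:
old_to_new = {
    "encoder.pos_conv_embed.conv.weight_g":
        "encoder.pos_conv_embed.conv.parametrizations.weight.original0",
    "encoder.pos_conv_embed.conv.weight_v":
        "encoder.pos_conv_embed.conv.parametrizations.weight.original1",
}

# A state_dict loaded from the old checkpoint could be remapped like this
# before loading, if the mismatch ever needs to be silenced explicitly:
def remap_keys(state_dict):
    return {old_to_new.get(k, k): v for k, v in state_dict.items()}
```

Since only the positional-convolution embedding is affected, extraction results should be unaffected and the warning can normally be ignored.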
 
 
 