query() of the generator with `max_length` succeeds
Goal: set min_length and max_length in a Hugging Face Transformers generator query. I've passed 50 and 200 as these parameters, yet my outputs come back much longer than that. There's no runtime failure.
from transformers import pipeline, set_seed
generator = pipeline('text-generation', model='gpt2')
set_seed(42)
def query(payload, multiple, min_char_len, max_char_len):
print(min_char_len, max_char_len)
list_dict = generator(payload, min_length=min_char_len, max_length=max_char_len, num_return_sequences=multiple)
test = [d['generated_text'].split(payload)[1].strip() for d in list_dict]
for t in test: print(len(t))
return test
query('example', 1, 50, 200)
Output:
50 200
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
1015
Explanation:
As explained by Narsil in a Hugging Face 🤗 Transformers GitHub issue response, min_length and max_length are measured in tokens, not characters, and both counts include the prompt. A GPT-2 token of English text averages roughly four characters, so a max_length of 200 tokens can easily decode to around 1,000 characters, which matches the 1015-character output above.
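A quick way to confirm that the limit applies to tokens is to re-tokenize the generated text and compare its token count with its character count. A minimal sketch, reusing the gpt2 pipeline from the question:

from transformers import pipeline, set_seed
generator = pipeline('text-generation', model='gpt2')
set_seed(42)
out = generator('example', min_length=50, max_length=200, num_return_sequences=1)
text = out[0]['generated_text']
# max_length caps the token count of prompt + continuation, not the character count,
# so the token count stays <= 200 even though the string is around 1,000 characters long.
n_tokens = len(generator.tokenizer(text)['input_ids'])
print(len(text), n_tokens)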
Solution:
Here is how I resolved it...
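Since min_length and max_length count tokens (prompt included), one way to target an approximate character budget is to convert characters into a token budget before calling the pipeline and then trim the decoded text. The helper below is only a sketch of that idea; query_chars and the chars_per_token ratio of roughly 4 characters per GPT-2 token are my own assumptions, not the exact fix from the issue:

def query_chars(payload, multiple, min_char_len, max_char_len, chars_per_token=4):
    # min_length/max_length are token counts (including the prompt), so derive
    # rough token budgets from the requested character budgets.
    min_tokens = max(1, min_char_len // chars_per_token)
    max_tokens = max(min_tokens + 1, max_char_len // chars_per_token)
    list_dict = generator(payload, min_length=min_tokens, max_length=max_tokens,
                          num_return_sequences=multiple)
    # Drop the prompt and hard-trim each continuation so it never exceeds max_char_len.
    return [d['generated_text'].split(payload, 1)[1].strip()[:max_char_len] for d in list_dict]

for t in query_chars('example', 1, 50, 200):
    print(len(t))

Depending on your transformers version, passing max_new_tokens (which counts only the generated tokens, not the prompt) may be a cleaner knob than max_length for the same purpose.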