Topic: How to get the response headers from the OpenAI API
Background:
I am relatively new to the OpenAI API and am trying to obtain my rate limits from the HTTP headers of the response, as discussed at https://platform.openai.com/docs/guides/rate-limits/usage-tiers?context=tier-free. However, none of my calls to OpenAI return a response with a headers attribute.
I have tried both the [openai package](https://platform.openai.com/docs/libraries/python-library) (code below) and llama-index.
from openai import OpenAI

client = OpenAI(
    # Defaults to os.environ.get("OPENAI_API_KEY")
    # Otherwise use: api_key="Your_API_Key",
)

chat_completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}]
)
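For context, the object returned by create() above is the parsed ChatCompletion model, so (assuming the v1 openai Python client used here) it carries no header information at all. A quick check:

# The parsed ChatCompletion model does not expose the underlying HTTP response,
# so there is no headers attribute on it (v1 openai Python client assumed)
print(hasattr(chat_completion, "headers"))  # expected: False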
I'm clearly missing something obvious, so if anyone can tell me what it is, it would be hugely appreciated!
Solution:
Ask for the raw response object, like this:
from openai import OpenAI

client = OpenAI(
    # Defaults to os.environ.get("OPENAI_API_KEY")
    # Otherwise use: api_key="Your_API_Key",
)

# .with_raw_response gives access to the underlying HTTP response
raw_response = client.chat.completions.with_raw_response.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}]
)

# Parse the body into the usual ChatCompletion object
chat_completion = raw_response.parse()

# The HTTP response headers
response_headers = raw_response.headers
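With the raw response available, the rate-limit values the question was after can be read from those headers. A minimal sketch, assuming the x-ratelimit-* header names documented on the rate-limits page linked above and that response_headers supports a case-insensitive get() (as httpx.Headers does):

# Rate-limit headers (names as documented on OpenAI's rate-limits page)
print("Request limit:     ", response_headers.get("x-ratelimit-limit-requests"))
print("Requests remaining:", response_headers.get("x-ratelimit-remaining-requests"))
print("Requests reset in: ", response_headers.get("x-ratelimit-reset-requests"))
print("Token limit:       ", response_headers.get("x-ratelimit-limit-tokens"))
print("Tokens remaining:  ", response_headers.get("x-ratelimit-remaining-tokens"))
print("Tokens reset in:   ", response_headers.get("x-ratelimit-reset-tokens"))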