OpenAI vs Mistral vs Claude vs Gemini APIs

Anh-Thi Dinh
draft
⚠️ This is a quick & dirty draft, for me only! ⚠️
In any case, check the official documentation; this note may become outdated.

Official references

Remarks (for all services)

  • Mistral and Claude don’t accept null in the request body; sending null for a property returns an error. OpenAI allows it: if you pass null, the default value for that property is used instead (see the sketch after this list).
  • Only OpenAI has the user and n parameters.
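
For example, a request body in the OpenAI style (the model name and values are only placeholders): "stop": null is fine for OpenAI (the default is used), while Mistral would reject the request because of the null value, and n / user don’t exist at all on the other services.

  {
    "model": "gpt-4o-mini",
    "messages": [
      { "role": "user", "content": "Hello, who are you?" }
    ],
    "stop": null,
    "n": 1,
    "user": "some-end-user-id"
  }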

Function calling (Tools / Tool call / Tool use)

  • Stronger models give better results (see the tool-definition sketch below).
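
As a rough comparison, this is how the same hypothetical get_weather tool would be declared in each request body. The shapes below follow the respective docs as I remember them, so double-check the current format before copying.

OpenAI / Mistral (same shape):

  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": { "city": { "type": "string" } },
          "required": ["city"]
        }
      }
    }
  ]

Claude (input_schema instead of parameters, no "type": "function" wrapper):

  "tools": [
    {
      "name": "get_weather",
      "description": "Get the current weather for a city",
      "input_schema": {
        "type": "object",
        "properties": { "city": { "type": "string" } },
        "required": ["city"]
      }
    }
  ]

Gemini (wrapped in function_declarations; the name must not start with a special character like -):

  "tools": [
    {
      "function_declarations": [
        {
          "name": "get_weather",
          "description": "Get the current weather for a city",
          "parameters": {
            "type": "object",
            "properties": { "city": { "type": "string" } },
            "required": ["city"]
          }
        }
      ]
    }
  ]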

Temperature

Range
  • OpenAI: [0, 2]
  • Mistral: [0, 1]
  • Claude: [0, 1]
  • Gemini: [0, 2] (gemini-1.5-pro, gemini-1.0-pro-002); [0, 1] (gemini-1.0-pro-vision, gemini-1.0-pro-001)
Default
  • OpenAI: 1
  • Mistral: 0.7
  • Claude: 1
  • Gemini: 1 (gemini-1.5-pro, gemini-1.0-pro-002); 0.4 (gemini-1.0-pro-vision); 0.9 (gemini-1.0-pro-001)

Max tokens

No service gives a very clear description of the max tokens; you have to read the documentation carefully (the parameter names are sketched after this list).
  • Claude caps the output at 4096 tokens for all models.
  • Some OpenAI models also cap the output at 4096 tokens.
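
The parameter name also differs: OpenAI, Mistral and Claude all call it max_tokens (top level), while Gemini uses maxOutputTokens inside generationConfig. The relevant fragments of the request body:

OpenAI / Mistral / Claude:

  "max_tokens": 1024

Gemini:

  "generationConfig": { "maxOutputTokens": 1024 }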

OpenAI

  • Accepts null in the request body: if you pass null for a property, its default value is used (minimal request sketch below).
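
For example (POST https://api.openai.com/v1/chat/completions with the API key in the Authorization: Bearer header; the model name is only an example):

  {
    "model": "gpt-4o-mini",
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "Hello, who are you?" }
    ],
    "temperature": 1,
    "stop": null
  }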

Mistral

  • Doesn’t accept null in the request body: passing null for any property returns an error, so drop the field instead (sketch below).
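
The body has essentially the same shape as OpenAI’s (POST https://api.mistral.ai/v1/chat/completions with a Bearer token); just leave out any field you would otherwise set to null:

  {
    "model": "mistral-small-latest",
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "Hello, who are you?" }
    ],
    "temperature": 0.7
  }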

Claude

  • Like Mistral, Claude doesn’t accept null in the request body.
  • max_tokens is required!
  • Both the request and the response formats are quite different from OpenAI and Mistral.
  • Claude doesn’t have an endpoint for listing the available models like OpenAI/Mistral do.
  • With the REST API you have to specify the API version (the anthropic-version header); with the SDK you don’t.
  • The first message must use the “user” role; OpenAI/Mistral don’t have this restriction. See the sketch after this list.
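
A minimal Messages API request for reference (POST https://api.anthropic.com/v1/messages with the x-api-key and anthropic-version headers); note the required max_tokens, the top-level system field instead of a system message, and the conversation starting with a “user” message. The model name is only an example:

  {
    "model": "claude-3-5-sonnet-20240620",
    "max_tokens": 1024,
    "system": "You are a helpful assistant.",
    "messages": [
      { "role": "user", "content": "Hello, who are you?" }
    ]
  }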

Gemini

  • Different from the other services, the role in a content message is either “user” or “model” (not “assistant”).
  • Different from the other services, Gemini doesn’t allow a tool name to start with a special character like -.
  • system_instruction isn’t enabled for the model gemini-1.0-pro; if you use it, you get the error “Developer instruction is not enabled for models/gemini-1.0-pro”. In this case, you can pass the instruction with "role": "model" instead:
    {
      "contents": [
        {
          "role": "model",
          "parts": { "text": "You are a very helpful assistant!" }
        },
        {
          "role": "user",
          "parts": { "text": "Hello, who are you?" }
        }
      ]
    }
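
For models that do support it (e.g. gemini-1.5-pro), the same conversation can use system_instruction instead; the field name below follows the REST docs, so verify it for the API version you call:

  {
    "system_instruction": {
      "parts": { "text": "You are a very helpful assistant!" }
    },
    "contents": [
      {
        "role": "user",
        "parts": { "text": "Hello, who are you?" }
      }
    ]
  }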