HCODX/Function Calling Converter
100% browser-based · OpenAI · Anthropic · Gemini

Function Calling Converter

Convert function/tool schemas between OpenAI, Anthropic, and Google Gemini formats. Paste the JSON for one provider, choose the target — get the equivalent definition out.


OpenAI → Anthropic, one click

OpenAI wraps the function spec under {type:"function", function:{…}}; Anthropic uses a flat object with input_schema. The semantics are identical — only the envelope changes.

OpenAI
{
  "type": "function",
  "function": {
    "name": "get_weather",
    "parameters": { "type": "object", ... }
  }
}
Anthropic
{
  "name": "get_weather",
  "input_schema": { "type": "object", ... }
}
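The unwrap step can be sketched in a few lines of Python (the helper name and sample fields are illustrative, not the converter's actual code):

```python
def openai_to_anthropic(tool: dict) -> dict:
    """Unwrap OpenAI's {type, function} envelope into Anthropic's flat shape."""
    fn = tool["function"]
    out = {"name": fn["name"], "input_schema": fn["parameters"]}
    if "description" in fn:  # optional field, carried over if present
        out["description"] = fn["description"]
    return out

openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

anthropic_tool = openai_to_anthropic(openai_tool)
```

The `parameters` object passes through untouched; only the keys around it are renamed or dropped.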
Use cases

What you'll use this for

Multi-provider agents need one tool registry but three serializations. This is the fastest way to keep them in sync.

Cross-provider porting

Ship the same tools to OpenAI, Claude, and Gemini without re-typing each schema.

RAG / agent dev

Iterate tool specs locally, then convert before plumbing into your provider SDK.

OpenAI → Claude switch

Mid-flight provider swap: convert your tools array to Anthropic's format in one step.

Schema testing

Round-trip your schema to confirm no fields are lost in translation.

Step by step

How to convert a tool schema

1

Paste your schema

Drop a single tool/function definition as JSON. Sample loaders show the OpenAI shape.

2

Pick source format

Tell the converter how to parse it — OpenAI, Anthropic, or Gemini.

3

Pick target format

The output editor updates live. Auto-convert is on by default.

4

Copy or download

Save as tool.json or paste straight into your SDK.

FAQ

Frequently asked questions

How do the three formats differ?

Anthropic uses a flat object with input_schema. OpenAI wraps the function in {type:"function", function:{…}}. Gemini is also flat, like Anthropic, but keeps OpenAI's parameters key and drops the wrapper.

Is it free to use?

Yes. No signup, no limits. Your schemas stay in your browser.

Does it validate my schema?

Only at the JSON level. Semantic correctness against each provider's spec (allowed types, nesting depth, etc.) is your responsibility; use the JSON validator if needed.

Does parallel tool calling change the schema?

No. The tool definition is the same whether the model calls one or many tools per turn. Parallel-call flags belong on the request body, not the tool schema.
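As a sketch (the model name and prompt are placeholders; `parallel_tool_calls` is OpenAI's Chat Completions request flag), the same tool definition is reused while the flag sits on the request:

```python
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
    },
}

# The flag sits beside the tools array, not inside any tool definition.
request_body = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Weather in Paris and Tokyo?"}],
    "tools": [weather_tool],
    "parallel_tool_calls": False,  # opt out of parallel calls; the tool is unchanged
}
```

Converting `weather_tool` to another provider never touches a flag like this, because it was never part of the schema.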

Can I convert in the other direction?

Yes: flip the From / To dropdowns. All three formats round-trip cleanly.
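A round trip can be checked mechanically. A minimal sketch with illustrative helpers (optional fields like description omitted for brevity):

```python
def openai_to_anthropic(tool):
    fn = tool["function"]
    return {"name": fn["name"], "input_schema": fn["parameters"]}

def anthropic_to_openai(tool):
    return {
        "type": "function",
        "function": {"name": tool["name"], "parameters": tool["input_schema"]},
    }

tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
    },
}

# Converting out and back should reproduce the original definition.
roundtripped = anthropic_to_openai(openai_to_anthropic(tool))
assert roundtripped == tool
```

A real round trip also has to carry optional fields (description, required, and so on) through both directions, or the equality check will flag the loss.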

About

About function calling formats

Function calling (sometimes called "tool use") lets a model emit structured JSON arguments instead of free-form text. Every major LLM provider supports it, but each chose a slightly different envelope around the same JSON-Schema parameters object.

OpenAI

  • Wraps tool defs in { "type": "function", "function": { "name", "description", "parameters" } }.
  • Optional "strict": true flag for structured outputs.

Anthropic

  • Flat: { "name", "description", "input_schema" }.
  • The input_schema is plain JSON Schema.

Gemini

  • Flat: { "name", "description", "parameters" }.
  • Lives inside a functionDeclarations array at the request level.
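Putting the bullets together, a Gemini declaration and the request-level array it lives in might look like this (the surrounding contents structure is an assumption based on Gemini's REST API; the declaration's field names come from above):

```python
weather_decl = {
    "name": "get_weather",
    "description": "Get current weather for a city.",
    "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
}

# Declarations are grouped under a request-level functionDeclarations array.
request_body = {
    "contents": [{"role": "user", "parts": [{"text": "Weather in Paris?"}]}],
    "tools": [{"functionDeclarations": [weather_decl]}],
}
```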
Related

Related tools