core
from IPython.display import Image, display
from pathlib import Path
API Exploration
Anthropic’s Claude and OpenAI’s GPT models are some of the most popular LLMs.
Let’s take a look at their APIs to learn how we should structure our messages for a simple text chat.
openai
from openai import OpenAI

client = OpenAI()
client.responses.create(
    model="gpt-4.1",
    input=[{"role": "user", "content": "Hello, world!"}]
)
Hello! 🌍 How can I help you today?
- id: resp_68588c698304819f884df4486e914e5708c46f56b4a6e7d0
- created_at: 1750633577.0
- error: None
- incomplete_details: None
- instructions: None
- metadata: {}
- model: gpt-4.1-2025-04-14
- object: response
- output: [ResponseOutputMessage(id='msg_68588c69c214819fa49ce49ad0cdecc808c46f56b4a6e7d0', content=[ResponseOutputText(annotations=[], text='Hello! 🌍 How can I help you today?', type='output_text', logprobs=None)], role='assistant', status='completed', type='message')]
- parallel_tool_calls: True
- temperature: 1.0
- tool_choice: auto
- tools: []
- top_p: 1.0
- background: False
- max_output_tokens: None
- previous_response_id: None
- prompt: None
- reasoning: Reasoning(effort=None, generate_summary=None, summary=None)
- service_tier: default
- status: completed
- text: ResponseTextConfig(format=ResponseFormatText(type='text'))
- truncation: disabled
- usage: ResponseUsage(input_tokens=11, input_tokens_details=InputTokensDetails(cached_tokens=0), output_tokens=12, output_tokens_details=OutputTokensDetails(reasoning_tokens=0), total_tokens=23)
- user: None
- store: True
anthropic
from anthropic import Anthropic

client = Anthropic()
client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, world!"}]
)
Hello! It’s nice to meet you. I’m an AI assistant created by Anthropic. I’m here to help with any questions or tasks you may have. Please let me know if there’s anything I can assist you with.
- id: msg_01XFGugPiSQ2PsEiWTLDcc6V
- content: [{'citations': None, 'text': "Hello! It's nice to meet you. I'm an AI assistant created by Anthropic. I'm here to help with any questions or tasks you may have. Please let me know if there's anything I can assist you with.", 'type': 'text'}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {'cache_creation_input_tokens': 0, 'cache_read_input_tokens': 0, 'input_tokens': 11, 'output_tokens': 51, 'server_tool_use': None, 'service_tier': 'standard'}
As we can see, both APIs use the exact same message structure.
mk_msg
Ok, let’s build the first version of mk_msg to handle this case.
def mk_msg(content:str, role:str="user")->dict:
    "Create an OpenAI/Anthropic compatible message."
    return dict(role=role, content=content)
Let’s test it out with the OpenAI API. To do that we’ll need to set up two things:
- install the openai SDK by running pip install openai
- add your openai api key to your env vars: export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
oa_cli = OpenAI()

r = oa_cli.responses.create(
    model="gpt-4o-mini",
    input=[mk_msg("Hello, world!")]
)
r.output_text
'Hello! How can I assist you today?'
Now, let’s test out mk_msg on the Anthropic API. To do that we’ll need to set up two things:
- install the anthropic SDK by running pip install anthropic
- add your anthropic api key to your env vars: export ANTHROPIC_API_KEY="YOUR_ANTHROPIC_API_KEY"
a_cli = Anthropic()

r = a_cli.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[mk_msg("Hello, world!")]
)
r.content[0].text
"Hello! It's nice to meet you. I'm an AI assistant created by Anthropic. I'm here to help with all kinds of tasks, from analysis and research to creative projects and casual conversation. Feel free to ask me anything!"
So far so good!
Helper Functions
Before going any further, let’s create some helper functions to make it a little easier to call the OpenAI and Anthropic APIs. We’re going to be making a bunch of API calls to test our code and typing the full expressions out each time will become a little tedious. These functions won’t be included in the final package.
def openai_chat(msgs: list)->tuple:
    "call the openai responses endpoint with `msgs`."
    r = oa_cli.responses.create(model="o4-mini", input=msgs)
    return r, r.output_text
Let’s double check that mk_msg still works with our simple text example from before.
_, text = openai_chat([mk_msg("Hello, world!")])
text
'Hello there! How can I help you today?'
def anthropic_chat(msgs: list)->tuple:
    "call the anthropic messages endpoint with `msgs`."
    r = a_cli.messages.create(model="claude-sonnet-4-20250514", max_tokens=1024, messages=msgs)
    return r, r.content[0].text
and Anthropic…
_, text = anthropic_chat([mk_msg("Hello, world!")])
text
'Hello! Nice to meet you! How are you doing today? Is there anything I can help you with?'
Images
Ok, let’s see how both APIs handle image messages.
openai
import base64, httpx

img_url = "https://claudette.answer.ai/index_files/figure-html/cell-35-output-1.jpeg"
mtype = "image/jpeg"
img_content = httpx.get(img_url).content
img = base64.b64encode(img_content).decode("utf-8")
client = OpenAI()
r = client.responses.create(
    model="gpt-4o-mini",
    input=[
        {"role": "user",
         "content": [
            {"type": "input_text", "text": "What's in this image?"},
            {"type": "input_image", "image_url": f"data:image/jpeg;base64,{img}"},
         ]}
    ],
)
r.output_text
'The image contains a puppy lying on the grass near some flowers. The puppy has a white coat with brown markings and appears to be playful and curious. The setting seems to be outdoors, with greenery and blooming flowers in the background.'
anthropic
= "image/jpeg"
mtype = base64.b64encode(img_content).decode("utf-8")
img
client = Anthropic()
r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[
        {"role": "user",
         "content": [
            {"type": "text", "text": "What's in this image?"},
            {"type": "image", "source": {"type": "base64", "media_type": mtype, "data": img}}
         ]}
    ],
)
r.content[0].text
"This image shows a cute puppy lying in a grassy area with purple flowers in the background. The puppy appears to be a Cavalier King Charles Spaniel, with a long, silky coat in a reddish-brown color with white markings. The puppy has a friendly, inquisitive expression on its face as it gazes directly at the camera. The image conveys a sense of tranquility and natural beauty, with the vibrant purple flowers providing a lovely contrast to the puppy's warm coloring."
Both APIs format images slightly differently, and the structure of the message content is a little more complex. In a text chat, content is a simple string, but for a multimodal chat (text+images) we can see that content is a list of dictionaries.
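To make the difference concrete, here are the two content shapes side by side (base64 data elided); only the block type names and the image encoding differ:

# OpenAI responses API content parts
[{"type": "input_text",  "text": "What's in this image?"},
 {"type": "input_image", "image_url": "data:image/jpeg;base64,..."}]

# Anthropic messages API content parts
[{"type": "text",  "text": "What's in this image?"},
 {"type": "image", "source": {"type": "base64", "media_type": "image/jpeg", "data": "..."}}]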
Msg Class
Basics
Let’s create _mk_img to make our code a little DRYer.
Exported source
import mimetypes, imghdr  # assumed module-level imports used by this helper

def _mk_img(data:bytes)->tuple:
    "Convert image bytes to a base64 encoded image"
    img = base64.b64encode(data).decode("utf-8")
    mtype = mimetypes.types_map["."+imghdr.what(None, h=data)]
    return img, mtype
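As a quick sanity check, running it on the JPEG bytes we downloaded earlier should return the base64 string together with the detected mime type (illustrative example, not from the original notebook):

img, mtype = _mk_img(img_content)
mtype  # 'image/jpeg'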
To handle the additional complexity of multimodal messages, let’s build a Msg class for the content data structure:
{
"role": "user",
"content": [{"type": "text", "text": "What's in this image?"}],
}
Msg
Msg ()
Helper class to create a message for the OpenAI and Anthropic APIs.
Exported source
class Msg:
    "Helper class to create a message for the OpenAI and Anthropic APIs."
    pass
As both APIs handle images differently, let’s subclass Msg for each API and handle the image formatting in a method called img_msg.
OpenAiMsg
OpenAiMsg ()
Helper class to create a message for the OpenAI API.
Exported source
class OpenAiMsg(Msg):
    "Helper class to create a message for the OpenAI API."
    pass
AnthropicMsg
AnthropicMsg ()
Helper class to create a message for the Anthropic API.
Exported source
class AnthropicMsg(Msg):
    "Helper class to create a message for the Anthropic API."
    pass
Let’s write some helper functions for mk_content to use.
Exported source
def _is_img(data): return isinstance(data, bytes) and bool(imghdr.what(None, data))
A PDF file should start with %PDF followed by the PDF version, e.g. %PDF-1.1.
Exported source
def _is_pdf(data): return isinstance(data, bytes) and data.startswith(b'%PDF-')
We create the appropriate content type based on what’s passed in:
Msg.mk_content
Msg.mk_content (content, text_only=False)
Exported source
from fastcore.utils import patch  # assumed import: the @patch decorator comes from fastcore

@patch
def mk_content(self:Msg, content, text_only=False)->dict:
    if isinstance(content, str): return self.text_msg(content, text_only=text_only)
    if _is_img(content): return self.img_msg(content)
    if _is_pdf(content): return self.pdf_msg(content)
    return content
…then we use this content to build the full message:
@patch
def __call__(self:Msg, role:str, content:[list, str], text_only:bool=False, **kw)->dict:
    "Create an OpenAI/Anthropic compatible message with `role` and `content`."
    if content is not None and not isinstance(content, list): content = [content]
    content = [self.mk_content(o, text_only=text_only) for o in content] if content else ''
    return dict(role=role, content=content[0] if text_only else content, **kw)
OpenAI implementations:
OpenAiMsg.text_msg
OpenAiMsg.text_msg (s:str, text_only=False)
Convert s to a text message
Exported source
@patch
def img_msg(self:OpenAiMsg, data:bytes)->dict:
    "Convert `data` to an image message"
    img, mtype = _mk_img(data)
    return {"type": "input_image", "image_url": f"data:{mtype};base64,{img}"}

@patch
def text_msg(self:OpenAiMsg, s:str, text_only=False)->dict:
    "Convert `s` to a text message"
    return s if text_only else {"type": "input_text", "text": s}
OpenAiMsg.img_msg
OpenAiMsg.img_msg (data:bytes)
Convert data to an image message
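Putting the pieces together, calling an OpenAiMsg instance directly should produce the same nested structure we built by hand earlier (an illustrative example, not from the original notebook; base64 data truncated):

m = OpenAiMsg()
m("user", ["What's in this image?", img_content])
# {'role': 'user',
#  'content': [{'type': 'input_text', 'text': "What's in this image?"},
#              {'type': 'input_image', 'image_url': 'data:image/jpeg;base64,...'}]}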
Anthropic implementations:
AnthropicMsg.text_msg
AnthropicMsg.text_msg (s:str, text_only=False)
Convert s to a text message
Exported source
@patch
def img_msg(self:AnthropicMsg, data:bytes)->dict:
    "Convert `data` to an image message"
    img, mtype = _mk_img(data)
    r = {"type": "base64", "media_type": mtype, "data": img}
    return {"type": "image", "source": r}

@patch
def text_msg(self:AnthropicMsg, s:str, text_only=False)->dict:
    "Convert `s` to a text message"
    return s if text_only else {"type": "text", "text": s}
AnthropicMsg.img_msg
AnthropicMsg.img_msg (data:bytes)
Convert data to an image message
mk_msg
mk_msg (content:Union[list,str], role:str='user', *args, api:str='openai', **kw)
Create an OpenAI/Anthropic compatible message.
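The exported source for this final mk_msg isn’t reproduced above, but given the signature and the subclasses we just defined, the dispatch on api presumably looks something like this minimal sketch (not necessarily the library’s exact code):

# hypothetical sketch of the api dispatch
def mk_msg(content, role="user", *args, api="openai", **kw):
    "Create an OpenAI/Anthropic compatible message."
    m = AnthropicMsg() if api == "anthropic" else OpenAiMsg()
    return m(role, content, *args, **kw)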
"Hello world", "how are you?"], api='openai') mk_msg([
{ 'content': [ {'text': 'Hello world', 'type': 'input_text'},
{'text': 'how are you?', 'type': 'input_text'}],
'role': 'user'}
"Hello world", "how are you?"], api='anthropic') mk_msg([
{ 'content': [ {'text': 'Hello world', 'type': 'text'},
{'text': 'how are you?', 'type': 'text'}],
'role': 'user'}
msg = mk_msg([img_content, "describe this picture"], api="openai")
_, text = openai_chat([msg])
text
'A small puppy, likely a young spaniel, is lying in green grass beside a pot of purple daisy-like flowers. Key details: \n• Coat: Soft white fur with rich chestnut-brown patches, especially around its floppy ears and eyes. \n• Pose: Front paws stretched forward, body low to the ground, head slightly tilted, looking straight at the camera with a curious, gentle expression. \n• Setting: Bright daylight, fresh green lawn, and clusters of delicate purple blooms tucked into a terracotta or wooden planter just behind the puppy. \n• Mood: Calm and inquisitive—its wide eyes and relaxed posture give the impression it’s quietly exploring its surroundings.'
msg = mk_msg([img_content, "describe this picture"], api="anthropic")
_, text = anthropic_chat([msg])
text
"This is an adorable photograph of a young puppy, likely a Cavalier King Charles Spaniel or similar breed, with beautiful reddish-brown and white fur markings. The puppy has distinctive coloring with a white face featuring a brown patch around one eye, and longer, silky ears that are a rich auburn color. \n\nThe puppy is positioned on green grass and appears to be resting or lying down near some purple flowers, which look like small daisies or asters. The setting appears to be outdoors in a garden area, with what looks like a brick or stone structure in the background. The lighting gives the photo a warm, natural feel, and the puppy's expression is sweet and gentle, looking directly at the camera with dark, soulful eyes. The overall composition creates a charming, pastoral scene that highlights the puppy's natural beauty."
PDFs
What about chatting with PDFs? Unfortunately, OpenAI’s message completions API doesn’t offer PDF support at the moment, but Claude does.
Under the hood, Claude extracts the text from the PDF and converts each page to an image. This means you can ask Claude about any text, pictures, charts, and tables in the PDF. Here’s an example from the Claude docs. Overall the message structure is pretty similar to an image message.
= "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf"
pdf_url = base64.standard_b64encode(httpx.get(pdf_url).content).decode("utf-8")
pdf_data = anthropic.Anthropic()
client = client.messages.create(
message ="claude-3-5-sonnet-20241022", max_tokens=1024,
model=[{
messages"role": "user",
"content": [
{"type": "document",
"source": { "type": "base64", "media_type": "application/pdf", "data": pdf_data }
},
{"type": "text",
"text": "Which model has the highest human preference win rates across each use-case?"
}
]
}] )
Let’s create a function that converts a byte string to the base64 encoded string that Anthropic expects.
Exported source
def _mk_pdf(data:bytes)->str:
    "Convert pdf bytes to a base64 encoded pdf"
    return base64.standard_b64encode(data).decode("utf-8")
We add a pdf_msg method to AnthropicMsg that uses _mk_pdf.
AnthropicMsg.pdf_msg
AnthropicMsg.pdf_msg (data:bytes)
Convert data to a pdf message
Exported source
@patch
def pdf_msg(self:AnthropicMsg, data: bytes) -> dict:
    "Convert `data` to a pdf message"
    r = {"type": "base64", "media_type": "application/pdf", "data": _mk_pdf(data)}
    return {"type": "document", "source": r}
Let’s test our changes on a financial report.
pdf = Path('financial_report.pdf').read_bytes()
msg = mk_msg([pdf, "what was the average monthly revenue for product D?"], api="anthropic")
_, text = anthropic_chat([msg])
text
'To find the average monthly revenue for Product D, I need to read the values from the bar chart on page 5.\n\nFrom the Product D chart, the monthly revenue values (in dollars) are approximately:\n- Jan: 900\n- Feb: 500\n- Mar: 400\n- Apr: 700\n- May: 800\n- Jun: 900\n- Jul: 1000\n- Aug: 1050\n- Sep: 1200\n- Oct: 1300\n- Nov: 1300\n- Dec: 1300\n\nTotal revenue = 900 + 500 + 400 + 700 + 800 + 900 + 1000 + 1050 + 1200 + 1300 + 1300 + 1300 = 11,350\n\nAverage monthly revenue = 11,350 ÷ 12 = $945.83\n\nThe average monthly revenue for Product D was approximately $946.'
Conversation
LLMs are stateless. To continue a conversation we need to include the entire message history in every API call. By default the role in each message alternates between user and assistant.
Let’s add a function, mk_msgs, that alternates the roles for us and calls mk_msg on each message.
def mk_msgs(msgs: list, *args, api:str="openai", **kw) -> list:
    "Create a list of messages compatible with OpenAI/Anthropic."
    if isinstance(msgs, str): msgs = [msgs]
    return [mk_msg(o, ('user', 'assistant')[i % 2], *args, api=api, **kw) for i, o in enumerate(msgs)]
"Hello", "Some assistant response", "tell me a joke"]) mk_msgs([
[{'role': 'user', 'content': 'Hello'},
{'role': 'assistant', 'content': 'Some assistant response'},
{'role': 'user', 'content': 'tell me a joke'}]
SDK Objects
To make our lives even easier, it would be nice if mk_msg could format the SDK objects returned from a previous chat so that we can pass them straight to mk_msgs.
The OpenAI SDK accepts objects like ChatCompletion as messages. Anthropic is different and expects every message to have the role, content format that we’ve seen so far.
Msg.__call__
Msg.__call__ (role:str, content:[list, str], text_only:bool=False, **kw)
Create an OpenAI/Anthropic compatible message with role and content.
AnthropicMsg.find_block
AnthropicMsg.find_block (r)
Find the message in r.
AnthropicMsg.is_sdk_obj
AnthropicMsg.is_sdk_obj (r)
Check if r is an SDK object.
OpenAiMsg.find_block
OpenAiMsg.find_block (r)
Find the message in r.
OpenAiMsg.is_sdk_obj
OpenAiMsg.is_sdk_obj (r)
Check if r is an SDK object.
mk_msgs
mk_msgs (msgs:list, *args, api:str='openai', **kw)
Create a list of messages compatible with OpenAI/Anthropic.
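The exported source for these SDK helpers isn’t shown above, but the gist is: is_sdk_obj spots a response object rather than a plain dict, and find_block extracts the part that can be replayed as the assistant turn. For OpenAI the response’s output items can be passed through as-is (as the example below shows), while for Anthropic the content has to be rewrapped into a role/content dict. A rough, assumed sketch of the Anthropic side:

# hypothetical sketch -- the actual msglm implementation may differ
@patch
def is_sdk_obj(self:AnthropicMsg, r):
    "Check if `r` is an SDK object."
    return hasattr(r, "content") and hasattr(r, "role")

@patch
def find_block(self:AnthropicMsg, r):
    "Find the message in `r` and rewrap it as a plain role/content dict."
    return dict(role=r.role, content=r.content) if self.is_sdk_obj(r) else r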
Let’s test our changes.
= ["tell me a joke"]
msgs = openai_chat(mk_msgs(msgs))
r, text text
'Why did the scarecrow win an award? \nBecause he was outstanding in his field!'
+= [r, "tell me another joke that's similar to your first joke"]
msgs = mk_msgs(msgs)
mm mm
[{'role': 'user', 'content': 'tell me a joke'},
ResponseReasoningItem(id='rs_68588c8944b0819db384cbf628a0d7b20702288fa4b11d37', summary=[], type='reasoning', encrypted_content=None, status=None),
ResponseOutputMessage(id='msg_68588c8ae410819d822acbdcd3d2ab910702288fa4b11d37', content=[ResponseOutputText(annotations=[], text='Why did the scarecrow win an award? \nBecause he was outstanding in his field!', type='output_text', logprobs=None)], role='assistant', status='completed', type='message'),
{'role': 'user',
'content': "tell me another joke that's similar to your first joke"}]
r, text = openai_chat(mm)
text
'Why did the bicycle fall over? \nBecause it was two-tired!'
Usage
To make msglm a little easier to use, let’s create OpenAI and Anthropic wrappers for mk_msg and mk_msgs.
from functools import partial

mk_msg_anthropic = partial(mk_msg, api="anthropic")
mk_msgs_anthropic = partial(mk_msgs, api="anthropic")
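The matching OpenAI wrappers imported below as mk_msg_openai and mk_msgs_openai are presumably defined the same way; a sketch under that assumption:

mk_msg_openai = partial(mk_msg, api="openai")
mk_msgs_openai = partial(mk_msgs, api="openai")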
If you’re using OpenAI, you should be able to use the import below:
from msglm import mk_msg_openai as mk_msg, mk_msgs_openai as mk_msgs
Similarly, for Anthropic:
from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs
Extra features
Caching
Anthropic currently offers prompt caching, which can reduce cost and latency.
To cache a message, we simply add a cache_control field to our content as shown below.
{"role": "user",
"content": [
{"type": "text",
"text": "Hello, can you tell me more about the solar system?",
"cache_control": {"type": "ephemeral"}
}
] }
Let’s update our mk_msg and mk_msgs Anthropic wrappers to support caching.
mk_msgs_anthropic
mk_msgs_anthropic (*args, cache=False, cache_last_ckpt_only=False, api:str='openai')
Create a list of Anthropic compatible messages.
mk_msg_anthropic
mk_msg_anthropic (*args, cache=False, role:str='user', api:str='openai')
Create an Anthropic compatible message.
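Neither cache-aware wrapper is reproduced above, but the behaviour visible in the examples below is that cache=True forces the content into block form and tags the last text block with cache_control. A minimal, hypothetical sketch of that idea (the exported code may differ):

# hypothetical sketch of the caching behaviour shown below
def _add_cache(msg):
    "Tag the last content block of `msg` with an ephemeral cache_control."
    c = msg["content"]
    if isinstance(c, str): c = [{"type": "text", "text": c}]
    c[-1] = {**c[-1], "cache_control": {"type": "ephemeral"}}
    return {**msg, "content": c}

def mk_msg_anthropic(*args, cache=False, **kw):
    "Create an Anthropic compatible message, optionally cached."
    msg = mk_msg(*args, api="anthropic", **kw)
    return _add_cache(msg) if cache else msg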
Let’s see caching in action
"Don't cache my message") mk_msg_anthropic(
{'content': "Don't cache my message", 'role': 'user'}
"Please cache my message", cache=True) mk_msg_anthropic(
{ 'content': [ { 'cache_control': {'type': 'ephemeral'},
'text': 'Please cache my message',
'type': 'text'}],
'role': 'user'}
Citations
The Anthropic API provides detailed citations when answering questions about documents.
When citations are enabled a citations block like the one below will be included in the response.
{"content": [
"type": "text", "text": "According to the document, " },
{
{"type": "text", "text": "the grass is green",
"citations": [{
"type": "char_location",
"cited_text": "The grass is green.",
"document_index": 0, "document_title": "Example Document",
"start_char_index": 0, "end_char_index": 20
}]
}
] }
To enable citations you need to create an Anthropic document with the following structure.
{"type": "document",
"source": {...},
"title": "Document Title", # optional
"context": "Context about the document that will not be cited from", # optional
"citations": {"enabled": True}
}
Currently Anthropic supports citations on 3 document types:
- text
- pdfs
- custom
A text document has the following source structure.
"type": "text", "media_type": "text/plain", "data": "Plain text content..."} {
Here’s the source structure for a pdf.
"type": "base64", "media_type": "application/pdf", "data": b64_enc_data} {
Finally, here’s the source structure for a custom document.
{"type": "content",
"content": [
"type": "text", "text": "First chunk"},
{"type": "text", "text": "Second chunk"}
{
] }
mk_ant_doc
mk_ant_doc (content, title=None, context=None, citation=True, **kws)
Create an Anthropic document.
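The exported source isn’t shown above, but given the output in the next example, a plausible sketch for the plain-text case looks something like this (simplified and assumed; the real function also handles other source types):

# simplified sketch for text content only
def mk_ant_doc(content, title=None, context=None, citation=True, **kws):
    "Create an Anthropic document."
    doc = {"type": "document",
           "source": {"type": "text", "media_type": "text/plain", "data": content},
           "citations": {"enabled": citation}, **kws}
    if title: doc["title"] = title
    if context: doc["context"] = context
    return doc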
Here’s how you would implement the example from the citations docs.
= mk_ant_doc("The grass is green. The sky is blue.", title="My Document", context="This is a trustworthy document.")
doc "What color is the grass and sky?"]) mk_msg([doc,
{ 'content': [ { 'citations': {'enabled': True},
'context': 'This is a trustworthy document.',
'source': { 'data': 'The grass is green. The sky is blue.',
'media_type': 'text/plain',
'type': 'text'},
'title': 'My Document',
'type': 'document'},
{ 'text': 'What color is the grass and sky?',
'type': 'input_text'}],
'role': 'user'}