ChatCohere

This notebook covers how to get started with Cohere chat models.

from langchain.chat_models import ChatCohere
from langchain.schema import HumanMessage

# Reads your API key from the COHERE_API_KEY environment variable by default.
chat = ChatCohere()
messages = [HumanMessage(content="knock knock")]
chat(messages)
AIMessage(content="Who's there?")

ChatCohere also supports async and streaming functionality:

from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
await chat.agenerate([messages])
Who's there?
LLMResult(generations=[[ChatGenerationChunk(text="Who's there?", message=AIMessageChunk(content="Who's there?"))]], llm_output={}, run=[RunInfo(run_id=UUID('1e9eaefc-9c99-4fa9-8297-ef9975d4751e'))])
chat = ChatCohere(
    streaming=True,
    verbose=True,
    callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
)
chat(messages)
Who's there?
AIMessageChunk(content="Who's there?")