Supercharging Gen-AI
Applications with Gemini
Function Calling
Suresh Peiris
Co-Founder at Inforwaves
Community Lead - GDG Sri Lanka
A type of artificial
intelligence that
generates content
for you.
What is
an LLM?
LLMs Explained
(Diagram: next-token prediction — for the prompt “It’s raining cats and …”, the
model assigns a probability to each candidate next word — Dogs, Rain, Drops,
Fish, Wind, … — with “Dogs” scoring highest at 0.9.)
Roses are red,
Violets are blue,
Sugar is sweet,
Modern LLMs are
large
Classic Natural
Language Problems
Entity extraction Classification Summarization
Sentiment Analysis Translation …
LLMs let us
prototype fast
Why are large language
models different?
LLMs are characterized by emergent
abilities, or the ability to perform tasks
that were not present in smaller models.
LLMs' contextual understanding of human
language changes how we interact with
data and intelligent systems.
LLMs can find patterns and connections in
massive, disparate data corpora.
Search
Conversation
Content generation
The Gemini Ecosystem
The most advanced AI from Google
For Developers
For Consumers
For Business and Enterprise
Models
Gemini API
(in Google AI Studio + ai.google.dev)
Gemini for Google Workspace
Gemini for Google Cloud
Gemini in Vertex AI
Gemini | app and web
Gemini in the Google App
Gemini in Gmail, Docs…
this deck
is about
The Gemini Ecosystem
Models
The Gemini Ecosystem
Models
Getting started with
the Gemini API
Visit the cookbook:
https://github.com/google-gemini/cookbook
Start developing:
1. Go to Google AI Studio.
2. Login with your Google account.
3. Create an API key.
4. Use a quickstart for Python, or call the REST API using curl.
Get Started
ai.google.dev
REST API + Client libraries for Python, Node, Java, and Swift
Libraries
SDKs
Vertex AI
Enterprise-grade support.
Full MLOps (Examples: Model
evaluation, monitoring, registry)
Vertex AI
Check it out when you're
ready for production
Gemini API and Vertex AI
Both give access to the Gemini family of
models.
Vertex AI
ai.google.dev/docs/migrate_to_cloud
Vertex AI is the Google Cloud
product group for ML
End-to-end ML Platform
Model Garden
Model Garden
● Foundation Models
● Open Source Models
● Task Specific Models
Sample Models
Model Garden
Function
Calling
Have you ever asked a generative AI model for
real-time information, only to receive outdated
or inaccurate responses?
Problem
Function Calling
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app
goo.gle/gemini-fn-call-sql-github
SELECT ROUND((
COUNT(DISTINCT IF(returned_at IS NOT NULL, order_id, NULL)) / COUNT(DISTINCT
order_id)) * 100, 2)
AS return_rate
FROM thelook_ecommerce.orders
goo.gle/gemini-fn-call-sql-github
Function calling in Gemini
● A framework to connect LLMs to real-time data
● Delegate certain data processing tasks to
functions
Elements of a Function call
● Function Declaration (name, description, parameters, required)
● Tools (group functions into tools)
● Attach (tools to Gemini initialization)
● Response Parts (parts that are returned to Gemini for response
generation)
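These elements can be sketched as plain JSON-style dicts. The weather function here is illustrative, not from the deck:

```python
# A minimal sketch of the elements above, as a JSON-style tool definition.
# The function name, description, and parameters are made up for illustration.
get_weather = {
    "name": "get_weather",                              # Function Declaration: name
    "description": "Get current weather for a city.",   # description
    "parameters": {                                     # parameters (OpenAPI-style)
        "type": "OBJECT",
        "properties": {"city": {"type": "STRING"}},
        "required": ["city"],                           # required
    },
}

# Tools: group one or more function declarations into a tool.
weather_tool = {"function_declarations": [get_weather]}

# Attach: pass the tool when initializing the model, e.g.
# model = genai.GenerativeModel('gemini-1.0-pro', tools=[weather_tool])
```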
Function Declaration in Gemini tools
Gemini Tools
Gemini Fetch Functions
Gemini Function calling also supports
● Parallel function processing
● Nested parameters and complex schemas
● Function calling modes:
○ AUTO: the model decides whether to predict a function call or a
natural language response
○ ANY: forces the model to predict a function call
○ NONE: instructs the model not to predict function calls
Nested Params
Let's consider a scenario where we want
to generate detailed product
descriptions for an e-commerce
website.
Prompt:
Generate a detailed product description based on
the provided information. Highlight the key
features and benefits of the product.
Gemini Function calling benefits
● Feed in real-time results from your own app/database
● Adhere output to a specific schema, receiving consistently formatted
responses
● Work on multimodal inputs
● Agent and workflow processing
● Reasoning-engine and orchestrator jobs: steps to simulate compiling/running
the output
Function Calling
● Describe external functions to the model.
● The model may ask you to call the function to help it
respond to your queries.
● Endless possibilities for integrating external tools.
Function calling - Basics
● How?
The google.generativeai SDK will inspect the function's
type hints to determine the schema.
● Allowed types are limited:
AllowedTypes = (
int | float | str | bool | list | dict )
https://ai.google.dev/tutorials/function_calling_python_quickstart
def multiply(a: float, b: float):
"""Returns a * b."""
return a*b
model = genai.GenerativeModel(
model_name='gemini-1.0-pro',
tools=[multiply])
Function calling - Basics
● Because function calling requires alternating turns, it's
easiest to use through chat.
● Enable "automatic function calling" when you start a chat,
and the ChatSession will call the function(s) for you.
○ You don't have to use automatic function calling, it just
makes simple cases easier.
https://ai.google.dev/tutorials/function_calling_python_quickstart
chat = model.start_chat(
enable_automatic_function_calling=True)
response = chat.send_message(
'I have 57 cats, each owns 44 mittens, '
'how many mittens is that in total?')
print(response.text)
# The number of mittens in total is 2508.
Function calling - Basics
● What happened? Use the chat history to find out.
● The chat history collects all the function calls and responses
that took place.
https://ai.google.dev/tutorials/function_calling_python_quickstart
for content in chat.history:
part = content.parts[0]
print(content.role, "->", type(part).to_dict(part))
# user -> {'text': 'I have 57 cats, each owns 44 mittens, '
# 'how many mittens is that in total?'}
# model -> {'function_call': {'name': 'multiply',
# 'args': {'a': 57.0, 'b': 44.0}}}
# user -> {'function_response': {'name': 'multiply',
# 'response': {'result': 2508.0}}}
# model -> {'text': ' The number of mittens in total is 2508.'}
Function Calling Interaction
● Contents carry Text, Function Call, and Function Response parts; Tools carry
Function Declarations.
● If one or more function declarations are provided, the function calling
feature turns on.
● The model may predict a function call based on user content.
● The model can understand the function response and generate text OR another
function call.
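The turn structure can be simulated without the API at all — a toy loop with a stubbed model, just to show where function_call and function_response parts sit in the history:

```python
# Toy simulation of the function calling turn structure. The "model" here
# is a hard-coded stub, not the Gemini API.
def multiply(a: float, b: float) -> float:
    """Returns a * b."""
    return a * b

def stub_model(history: list[dict]) -> dict:
    last = history[-1]
    if "function_response" in last:          # model saw a function result
        result = last["function_response"]["response"]["result"]
        return {"role": "model", "text": f"The total is {result}."}
    return {"role": "model",                 # model predicts a function call
            "function_call": {"name": "multiply", "args": {"a": 57, "b": 44}}}

history = [{"role": "user", "text": "57 cats x 44 mittens = ?"}]
turn = stub_model(history)
while "function_call" in turn:               # run the function, send result back
    call = turn["function_call"]
    result = {"multiply": multiply}[call["name"]](**call["args"])
    history += [turn, {"role": "user",
                       "function_response": {"name": call["name"],
                                             "response": {"result": result}}}]
    turn = stub_model(history)
history.append(turn)
# history now mirrors the user -> function_call -> function_response -> text flow
```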
Function calling - More Examples
● Wikipedia research aid
○ Integrates a search tool.
○ Uses the Gemini API inside the function call to summarize pages.
def wikipedia_search(queries:list[str]) -> list[str]:
...
https://ai.google.dev/docs/search_reranking_using_embeddings/
model = genai.GenerativeModel('gemini-pro', tools=[wikipedia_search])
chat = model.start_chat(enable_automatic_function_calling=True)
query = "Explain how deep-sea life survives."
res = chat.send_message(instructions.format(query=query))
# Searching for "How do deep-sea creatures survive the extreme pressure?"
# Related search terms: ['Deep sea', 'Deep-sea community', 'Deep-sea fish']
# Fetching page: "Deep sea"
# Information Source: https://en.wikipedia.org/wiki/Deep_sea
# Fetching page: "Deep-sea community"
# Information Source: https://en.wikipedia.org/wiki/Deep-sea_community
# Fetching page: "Deep-sea fish"
# Information Source: https://en.wikipedia.org/wiki/Deep-sea_fish
# Searching for "How do deep-sea creatures survive the cold temperatures?"
# Related search terms: ['Deep-sea community', 'Deep sea', 'Deep-water coral']
# Fetching page: "Deep-water coral"
Function calling - More Examples
● SQL Talk
○ Use function calling to talk to a database.
○ Live example: https://goo.gle/gemini-fn-call-sql
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app
sql_query_func = FunctionDeclaration(
name="sql_query",
description="Get information from data in BigQuery using SQL queries",
parameters={
"type": "object",
"properties": {
"query": {
"type": "string",
"description": "SQL query on a single line ...
SELECT ROUND((
COUNT(DISTINCT IF(returned_at IS NOT NULL, order_id, NULL)) / COUNT(DISTINCT
order_id)) * 100, 2)
AS return_rate
FROM thelook_ecommerce.orders
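The query's logic can be checked locally with SQLite (BigQuery's IF becomes CASE WHEN; the table and rows below are made up):

```python
import sqlite3

# Local check of the return-rate query against a made-up orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, returned_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, None), (2, "2024-01-05"), (3, None), (4, "2024-02-01")])

# Same arithmetic as the BigQuery version: returned orders / all orders * 100.
row = conn.execute("""
    SELECT ROUND(
        COUNT(DISTINCT CASE WHEN returned_at IS NOT NULL THEN order_id END)
        * 100.0 / COUNT(DISTINCT order_id), 2) AS return_rate
    FROM orders
""").fetchone()
# 2 of 4 orders returned -> 50.0
```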
Function calling - Schema
● Automatically building the schema from the type hints doesn't
[currently 02/2024] work for everything.
● The allowed types are actually:
AllowedType= (
int | float | str | bool |
list['AllowedType'] |
dict[str, 'AllowedType']
)
https://ai.google.dev/tutorials/function_calling_python_quickstart
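One way to read the recursive AllowedType: a small checker (illustrative, not SDK code) that accepts exactly those values:

```python
# Checker for the recursive AllowedType above — illustrative, not SDK code.
def is_allowed(value) -> bool:
    if isinstance(value, bool):          # bool first: bool is a subclass of int
        return True
    if isinstance(value, (int, float, str)):
        return True
    if isinstance(value, list):          # list['AllowedType']
        return all(is_allowed(v) for v in value)
    if isinstance(value, dict):          # dict[str, 'AllowedType']
        return all(isinstance(k, str) and is_allowed(v)
                   for k, v in value.items())
    return False
```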
Function calling - Schema
● Let's look at how the schema is constructed.
https://ai.google.dev/tutorials/function_calling_python_quickstart
model = genai.GenerativeModel(
model_name='gemini-1.0-pro',
tools=[multiply])
model._tools.to_proto()
[function_declarations {
name: "multiply"
description: "returns a * b."
parameters {
type_: OBJECT
properties {
key: "b" value { type_: NUMBER }}
properties {
key: "a" value { type_: NUMBER }}
required: "a" required: "b" }}]
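A rough approximation of that introspection, using only the standard library — this simplifies what google.generativeai actually does, but produces the same shape as the proto above:

```python
import inspect

# Sketch of schema-from-type-hints; a simplification of the SDK's introspection.
TYPE_MAP = {int: "INTEGER", float: "NUMBER", str: "STRING",
            bool: "BOOLEAN", list: "ARRAY", dict: "OBJECT"}

def declaration_from(fn) -> dict:
    sig = inspect.signature(fn)
    props = {name: {"type": TYPE_MAP[p.annotation]}
             for name, p in sig.parameters.items()}
    return {"name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {"type": "OBJECT", "properties": props,
                           "required": list(props)}}

def multiply(a: float, b: float):
    """Returns a * b."""
    return a * b
```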
Function calling - Schema
● It's an OpenAPI schema, written as a protobuf.
● The protobuf classes are available in the google.ai.generativelanguage
client library.
● Reference docs:
https://ai.google.dev/api/python/google/ai/generativelanguage/FunctionDeclaration
https://ai.google.dev/tutorials/function_calling_python_quickstart
import google.ai.generativelanguage as glm
calculator = glm.Tool(
function_declarations=[
glm.FunctionDeclaration(
name='multiply',
description="Returns the product of two numbers.",
parameters=glm.Schema(
type=glm.Type.OBJECT,
properties={
'a': glm.Schema(type=glm.Type.NUMBER),
'b': glm.Schema(type=glm.Type.NUMBER)},
required=['a','b']))])
Function calling - Schema
● They can be written out as JSON-compatible objects as well.
https://ai.google.dev/tutorials/function_calling_python_quickstart
calculator = {
'function_declarations': [{
'name': 'multiply',
'description': 'Returns the product of two numbers.',
'parameters': {
'type': 'OBJECT',
'properties': {
'a': {'type': 'NUMBER'},
'b': {'type': 'NUMBER'}},
'required': ['a', 'b']}}]}
model = genai.GenerativeModel(
model_name='gemini-1.0-pro',
tools=[calculator])
Function calling - Structured data
● Structured data extraction.
● You can just ask the model to do it and return JSON.
https://ai.google.dev/tutorials/structured_data_extraction
response = model.generate_content(textwrap.dedent("""
Please return JSON describing the people, places, things and relationships from this
story using the following schema:
{"people": list[PERSON], "places":list[PLACE], "things":list[THING], "relationships": list[RELATIONSHIP]}
PERSON = {"name": str, "description": str, "start_place_name": str, "end_place_name": str}
PLACE = {"name": str, "description": str}
THING = {"name": str, "description": str, "start_place_name": str, "end_place_name": str}
RELATIONSHIP = {"person_1_name": str, "person_2_name": str, "relationship": str}
Here is the story:
""") + story)
Function calling - Structured data
● Asking for JSON often works.
● Function calling lets you strictly describe the schema.
● With a strict description, we can strictly enforce that that's
what gets returned.
https://ai.google.dev/tutorials/structured_data_extraction
add_to_database = glm.FunctionDeclaration(
name="add_to_database",
description="Adds entities to the database.",
parameters=glm.Schema(
type=glm.Type.OBJECT,
properties = {
'people': glm.Schema(
type=glm.Type.ARRAY,
items=glm.Schema(
type = glm.Type.OBJECT,
properties = {
'name': glm.Schema(type=glm.Type.STRING),
'description': glm.Schema(type=glm.Type.STRING),
'start_place_name': glm.Schema(type=glm.Type.STRING),
'end_place_name': glm.Schema(type=glm.Type.STRING)})),
'places': glm.Schema(
...
add_to_database = glm.FunctionDeclaration(
name="add_to_database",
description="Adds entities to the database.",
parameters={
"type": "OBJECT",
"properties": {
'people': {
"type": "ARRAY",
"items": {
"type": "OBJECT",
"properties": {
'name': {"type":"STRING"},
'description': {"type":"STRING"},
'start_place_name': {"type":"STRING"},
'end_place_name': {"type": "STRING"}}}},
'places': {...},
...
model = genai.GenerativeModel(
model_name='gemini-1.0-pro',
tools = [add_to_database])
Resources
● Gemini Cookbook
● Sample Repo
● Official Codelab
● Official Notebook
● Notebooks
Thank You!
Suresh Peiris
@sureshmichael


Editor's Notes

  • #3 Large language models (LLMs) predict the next word in a sequence by: learning from patterns in large amounts of text data (self-supervised pre-training); using a statistical model of language to estimate the probability of the next word; comparing predictions to the actual next word in the data; adjusting their parameters via backpropagation so they are more likely to predict the correct next word; analyzing the prompt and the predicted first word of the response to generate the second word; and using decoding strategies to extract coherent text strings from their probability estimates.
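The "statistical model" step in note #3 can be made concrete with a toy bigram counter — nothing like a real LLM, but it shows next-word probabilities coming straight from corpus counts:

```python
from collections import Counter, defaultdict

# Toy bigram "language model": next-word probabilities from corpus counts.
# Far from a real LLM, but it illustrates the prediction step in the note.
corpus = "it is raining cats and dogs . it is raining hard .".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_probs(word: str) -> dict[str, float]:
    c = counts[word]
    total = sum(c.values())
    return {w: n / total for w, n in c.items()}

probs = next_word_probs("raining")  # {'cats': 0.5, 'hard': 0.5}
```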
  • #31–#33 Parallel — for prompts such as "Get weather details in New Delhi and San Francisco?", the model may propose several parallel function calls. For a list of models that support parallel function calling, see Supported models.
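The parallel case in note #31 can be sketched on the client side: one model turn may contain several function_call parts, and the client executes each and returns all the responses (stubbed weather function, made-up values):

```python
# Toy handling of parallel function calls: one model turn, several
# function_call parts. The weather lookup is a stub with made-up values.
def get_weather(city: str) -> str:
    return {"New Delhi": "31C", "San Francisco": "17C"}.get(city, "unknown")

model_turn = [  # imagined model output containing two parallel calls
    {"function_call": {"name": "get_weather", "args": {"city": "New Delhi"}}},
    {"function_call": {"name": "get_weather", "args": {"city": "San Francisco"}}},
]

# Execute every predicted call and build one function_response part per call.
responses = [
    {"function_response": {
        "name": part["function_call"]["name"],
        "response": {"result": get_weather(**part["function_call"]["args"])}}}
    for part in model_turn
]
```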