[BuildWithAI] Introduction to Gemini.pdf

SandroMoreira8 · May 07, 2024

About This Presentation

Workshop Build With AI - Google Developers Group Rio Verde


Slide Content

Rio Verde

Introduction to the
Google Gemini API
Sandro Moreira (he/him)
Google Mentor @SandroEnsinaAI

Visit: https://github.com/google-gemini/cookbook

Getting started
1. Go to Google AI Studio.
2. Sign in with your Google account.
3. Create an API key.
4. Use the quickstart for Python (see the sketch below) or make a REST API request using curl.
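
A minimal Python sketch of step 4 (assuming `pip install google-generativeai` and that the key from step 3 is exported as a GOOGLE_API_KEY environment variable; the prompt text is illustrative):

```python
# Minimal quickstart sketch: configure the key, create a model, ask once.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])  # key from AI Studio

model = genai.GenerativeModel('gemini-pro')
response = model.generate_content('Hello, Gemini!')
print(response.text)
```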

Read more about the Google Gemini API

Get started!

What is generative AI?

What is
an LLM?
LLMs Explained

[Figure: next-token prediction. Given the prefix "It's raining cats
and", the model assigns a probability to each candidate next token
(Dogs, Rain, Drops, Fish, Wind, …), with "Dogs" by far the most
likely at 0.9.]

Prompt: Roses are red,
Completion: Violets are blue,
Sugar is sweet,
LLMs Explained

Input:  for(var i = 0;; i < 10; i++) {
Output: for(var i = 0; i < 10; i++) {

Modern LLMs
are large.
LLMs Explained

Classic Natural Language Problems
LLMs Explained
Entity extraction · Classification · Summarization
Sentiment analysis · Translation · … (see the sketch below)
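
As a taste of how one of these classic tasks looks with an LLM, here is a sentiment-analysis sketch using the Python SDK introduced later in this deck (the review text is made up for illustration):

```python
# Sketch: sentiment analysis reduced to a single prompt.
import google.generativeai as genai

model = genai.GenerativeModel('gemini-pro')
response = model.generate_content(
    'Classify the sentiment of this review as positive, negative, or '
    'neutral: "The battery died after two hours."')
print(response.text)  # expected: something like "negative"
```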

LLMs enable
rapid prototyping.
LLMs Explained

User:
Explain this joke: "Why did the tomato go to the
bank? - To get an extrato."

Model: The joke relies on a pun on the word
"extrato".
The joke suggests the tomato went to the bank to get an
"extrato"; but instead of a bank statement, it wanted
tomato extract, playing on the word's double meaning.
Extrato bancário (bank statement): a document showing the
transaction history of a bank account.
Extrato de tomate (tomato extract): a concentrated product
made from tomatoes.
LLMs Explained

User:
Give me five project ideas for an
Artificial Intelligence fair

Model:
1. **Painting with Neural Networks**
Implement a neural network model, such as a Generative Adversarial
Network (GAN), to turn photos into paintings in different styles
(e.g., impressionist, cubist, etc.).
2. **Emotion Recognition**
Build a system that uses a neural network to recognize emotions in
images of faces.
3. **Music with Neural Networks**
Create a system that uses a recurrent neural network (RNN) to generate
melodies or accompany a melody played by a visitor.
4. **Sign Language Translation**
Build a system that uses a convolutional neural network (CNN) to
recognize sign-language gestures and translate them into text or
speech.
5. **Game Controller Driven by Player Images**
Use images of the player's body and gestures to execute commands that
control a game character.
LLMs Explained

Google is a pioneer in the AI industry
Responsible AI
3,000 researchers
7,000 publications
Built and tested for safety
Privacy by design
Upholds high scientific standards
Accountable to people
Socially beneficial
Avoids creating unfair bias
2015: Google DeepMind's AlphaGo defeats the Go champion
2016: Google DeepMind helps detect eye disease
2017: Google invents the Transformer, kicking off the LLM revolution
2018: Google creates a groundbreaking large language model (BERT)
2019: Text-to-Text Transfer Transformer (T5), a 10B-parameter LLM, is open-sourced
2020: Google LaMDA, a model trained for conversation
2022: AlphaFold predicts the structures of all known proteins
2023: Conversational AI service built with PaLM 2
2024: A family of multimodal LLMs and products (Gemini)

Multimodality

On-device (Android AICore)

Gemma: ai.google.dev/gemma

The Gemini Ecosystem
Google's most advanced AI

Models
For Developers:
●Gemini API (in Google AI Studio + ai.google.dev)
For Business and Enterprises:
●Gemini for Google Workspace
●Gemini for Google Cloud
●Gemini in Vertex AI
For Consumers:
●Gemini | app and web
●Gemini in the Google App
●Gemini in Gmail, Docs, …

Let's focus a bit more here
(the Gemini API)

Getting started with the
Gemini API

AI Studio
aistudio.google.com
●Generate API Keys
●Create, test, and save prompts
●Customize models in minutes
●Generate starter code


ai.google.dev
REST API + Client libraries for Python, Node, Java, and Swift










Vertex AI

Enterprise-grade support.
Full MLOps (examples: model
evaluation, monitoring, registry)


Vertex AI

Check it out when you're
ready for production

Gemini API and Vertex AI

Both give access to the Gemini
family of models.
Vertex AI
Cloud Production
ai.google.dev/docs/migrate_to_cloud

Basically, prompts

Write a poem about a magical backpack.

In a world of wonder, behold,
A backpack with tales yet untold.
Its fabric shimmers, alive with might,
A portal to realms, day and night.

aistudio.google.com

Write a poem about a magical backpack

The backpack hangs there, a simple cloth sack,
Yet its seams hold superhuman secrets.
An entire universe where everything can fit,
From forests and oceans to a kingdom to hide in.
...
aistudio.google.com

Generate a short JavaScript tip. Good tips are short and include a
sample code snippet demonstrating it.
**Tip: Use Array.prototype.flat() to flatten an array of
arrays.**

```javascript
const array = [[1, 2], [3, 4], [5, 6]];
console.log(array.flat());

// Output: [1, 2, 3, 4, 5, 6]
```

Generate a short JavaScript tip. Good tips are short and include a
sample code snippet demonstrating it.

**Tip: Use Array.prototype.flat() to flatten an array of
arrays.**

```javascript
const array = [[1, 2], [3, 4], [5, 6]];
console.log(array.flat());

// Output: [1, 2, 3, 4, 5, 6]
```

Generate a short JavaScript tip. Good tips are short and include a
sample code snippet demonstrating it.

topic: let and const
tip: Prefer `let` and `const` to initialize variables instead of
`var`, since `var` has no block scoping or protection against
reassignment.

e.g. const myVar = "EXAMPLE";

topic: arrow syntax
tip:
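
A sketch of sending this few-shot prompt through the Python SDK, assuming `model` is a GenerativeModel as in the quickstart; the two labeled examples steer the model to complete the third entry in the same format:

```python
# Few-shot prompt: two completed examples, then an open "tip:" entry
# for the model to fill in.
prompt = """\
Generate a short JavaScript tip. Good tips are short and include a
sample code snippet demonstrating it.

topic: let and const
tip: Prefer `let` and `const` to initialize variables instead of
`var`, since `var` has no block scoping.

e.g. const myVar = "EXAMPLE";

topic: arrow syntax
tip:"""

response = model.generate_content(prompt)
print(response.text)  # the model's tip for arrow syntax
```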


import google.generativeai as genai

model = genai.GenerativeModel('models/gemini-pro')

resp = model.generate_content(
    'Write the first paragraph of a story about a magical backpack')

>>> print(resp.text)
Clara never imagined her life would change because of an old, dusty
backpack. It was made of worn, faded fabric, with colorful patches and
a frayed strap. She had found it forgotten in her grandmother's attic,
among boxes of old photographs and broken toys...

chat = model.start_chat()

response = chat.send_message(
    'Hi, what should I have for dinner?')

print(response.text)
# 'Here are some suggestions...'

response = chat.send_message(
    'How do I prepare the first one?')

Generate Content Request
●Contents
○Content of role "user", containing multimodal Parts
○Content of role "model"
○"user" content: a text Part
●Tools: specified by the caller
●Safety Settings: configured by the caller
●Generation Config: temperature, Top-P, Top-K, stop sequences,
max output tokens, etc.
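
A sketch of how those request fields map onto the Python SDK; the values are illustrative, not from the slides (GenerationConfig is part of the google.generativeai SDK, and the string shorthand for safety settings is one accepted form):

```python
import google.generativeai as genai

model = genai.GenerativeModel('gemini-pro')

response = model.generate_content(
    'Write a haiku about a magical backpack.',   # Contents
    generation_config=genai.GenerationConfig(    # Generation Config
        temperature=0.9,
        top_p=0.95,
        top_k=40,
        stop_sequences=['END'],
        max_output_tokens=256),
    safety_settings={'HARASSMENT': 'BLOCK_ONLY_HIGH'})  # Safety Settings
print(response.text)
```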

Generate Content Response
●Candidate(s): note, only one candidate is returned today
○Candidate "Content"
○Finish Reason: why the model stopped generating
○Finish Message: set if a finish reason is present
○Safety Ratings: how safe the response is
●Prompt Feedback: feedback on the prompt
○Block Reason
○Safety Ratings
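
And a quick way to inspect those response fields in the Python SDK, continuing from the request sketch above:

```python
# Only one candidate is returned today.
candidate = response.candidates[0]
print(candidate.finish_reason)    # why the model stopped generating
print(candidate.safety_ratings)   # per-category ratings for the response

# Feedback on the prompt itself (includes a block reason if blocked).
print(response.prompt_feedback)
```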

https://ai.google.dev/docs/prompt_best_practices
●Chained prompts: make a plan and execute it
●Context: few, small prompts
●Generation parameters: temperature, safety settings, Top-P, Top-K

Code generation

Code Generation
●Content generation: generate code based on a user prompt
●Convert human-language requests to machine requests
●Rapid tool building

Examples
Code generation
●Generate data
●Generate a SQL query (see the sketch below)
●Simulate execution
●Quick coding help: a smart pair programmer
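
For instance, the "generate a SQL query" case is a single prompt; a sketch, where the table schema in the prompt is made up for illustration:

```python
# Sketch: turning a natural-language request into SQL.
response = model.generate_content(
    'Given a table orders(order_id, status, created_at, returned_at), '
    'write a SQL query that returns the percentage of orders that were '
    'returned.')
print(response.text)
```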

Write a unit test for this Python function:

def is_diver_neutrally_buoyant(density_water, density_diver):
    """Check whether a diver is neutrally buoyant."""
    if density_water == density_diver:
        return True

import unittest

class IsDiverNeutrallyBuoyantTest(unittest.TestCase):

    def test_diver_is_neutrally_buoyant(self):
        self.assertTrue(is_diver_neutrally_buoyant(1000, 1000))

    def test_diver_is_not_neutrally_buoyant(self):
        self.assertFalse(is_diver_neutrally_buoyant(1000, 1010))

    def test_invalid_input_types(self):
        with self.assertRaises(TypeError):
            is_diver_neutrally_buoyant("1000", 1000)
        with self.assertRaises(TypeError):
            is_diver_neutrally_buoyant(1000, "1000")

Information search
and synthesis

●Models have knowledge cut-offs
●LLMs are not fact engines
●No exposure to private data
BYO Data
Search & IR

●Instructions + Context + Question all in the prompt
●Easy to implement
○No extra code, just ask.
Use the prompt's context window
Search & IR

import pathlib
import google.generativeai as genai

model = genai.GenerativeModel('gemini-pro')

document = pathlib.Path('document.txt').read_text()

result = model.generate_content(f"""
Explain how there can be life in the deep sea.

Answer the question based on the following document:
{document}""")

Use the prompt's context window
Learning more
●Limited by the model's context length
○gemini-1.0-pro: 30K tokens.

Search & IR

Use the prompt's context window
Learning more
●gemini-1.5-pro: 1M tokens.
○Remember: everything is tokens (see the count_tokens sketch below).
●Using context may be more flexible than Retrieval Augmented
Generation (RAG).
●More input context means higher latency.
●Join the waitlist:
https://aistudio.google.com/app/waitlist/97445851
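
Since limits (and billing) are measured in tokens, it helps to count them before sending a large prompt; count_tokens is part of the google.generativeai SDK:

```python
# Check how many tokens a prompt or document will consume.
model = genai.GenerativeModel('gemini-pro')
print(model.count_tokens('Explain how deep-sea life survives.'))
# -> total_tokens: ...
```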

https://blog.google/technology/ai/google-gemini-next-generation-model-february-2024


Search & IR

Use Attributed Question Answering
(AQA) with inline chunks
AQA
●Instructions + Chunks + Question in one request

●AQA is a new feature
○Feedback is welcome
○The AQA model is specifically tuned for RAG

import google.generativeai.answer
import google.generativeai as genai

chunks = ["Your Googlecar has a climate control system that allo...",
          "Your Googlecar has a large touchscreen display that p...",
          "Your Googlecar has an automatic transmission. To shif..."]

answer = genai.answer.generate_answer(
    contents='How do you make it less windy?',
    answer_style=AnswerStyle.ABSTRACTIVE,
    inline_passages=chunks)

answer { content { parts {
text: "To decrease the airflow inside the car, "
"turn the airflow knob counterclockwise." }}
...
grounding_attributions {
source_id { grounding_passage { passage_id: "0" } }
content { parts { text:
"Your Googlecar has a climate control system that allows..." }}}
}
answerable_probability: 0.995222747

Use AQA with inline chunks
Learning more
●Limited by the request size - Max 4MB.
●Inefficient when asking about the same data repeatedly.
●Simple to use.
●Example: handle chunks returned by a search tool.
AQA

Use AQA with the retriever service
Learning more
●Corpus < Document < Chunk hierarchy.
●Pass a reference to a document or corpus to the
generate_answer function.
AQA

from google.generativeai import retriever

my_corpus = retriever.create_corpus(display_name="My Documents")

for document in my_documents:
    doc = my_corpus.create_document(display_name=document.title)
    doc.batch_create_chunks(htmlchunker.chunk(document.html))

answer = genai.answer.generate_answer(
    semantic_retriever_config=my_corpus, ...)
(Coming soon)

Use AQA with the retriever service
Learning more
●Supports filtering based on chunk metadata
●Limits
○10k documents per corpus
○1M chunks per corpus
○5 corpora per user
AQA

Function Calling

Function Calling
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app goo.gle/gemini-fn-call-sql-github

Function Calling
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app
SELECT ROUND((
COUNT(DISTINCT IF(returned_at IS NOT NULL, order_id, NULL)) /
COUNT(DISTINCT order_id)) * 100, 2)
AS return_rate
FROM thelook_ecommerce.orders
goo.gle/gemini-fn-call-sql-github

Function Calling
Function Calling
●Describe external functions to the model.
●The model may ask you to call the function to help it respond
to your queries.
●Endless possibilities for integrating external tools.

Function calling - Basics
Function Calling
●How?
●The google.generativeai SDK will inspect the function's
type hints to determine the schema.
●Allowed types are limited:

AllowedTypes = (
    int | float | str | bool | list | dict)
https://ai.google.dev/tutorials/function_calling_python_quickstart

def multiply(a: float, b: float):
    """Returns a * b."""
    return a * b

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[multiply])

Function calling - Basics
Function Calling
●Because function calling requires alternating turns, it's easiest
to use through chat.
●Enable "automatic function calling" when you start a chat, and
the ChatSession will call the function(s) for you.
○You don't have to use automatic function calling, it just
makes simple cases easier.
https://ai.google.dev/tutorials/function_calling_python_quickstart

chat = model.start_chat(
    enable_automatic_function_calling=True)

response = chat.send_message(
    'I have 57 cats, each owns 44 mittens, '
    'how many mittens is that in total?')

print(response.text)
# The number of mittens in total is 2508.

Function calling - Basics
Function Calling
●What happened? Use the chat history to find out.
●The chat history collects all the function calls and responses
that took place.
https://ai.google.dev/tutorials/function_calling_python_quickstart

for content in chat.history:
    part = content.parts[0]
    print(content.role, "->", type(part).to_dict(part))

# user -> {'text': 'I have 57 cats, each owns 44 mittens, '
#          'how many mittens is that in total?'}
# model -> {'function_call': {'name': 'multiply',
#           'args': {'a': 57.0, 'b': 44.0}}}
# user -> {'function_response': {'name': 'multiply',
#          'response': {'result': 2508.0}}}
# model -> {'text': 'The number of mittens in total is 2508.'}

Function Calling interaction
●Contents: Text → Function Call → Function Response → Text
●Tools: one or more Function Declarations; if any are provided, the
function calling feature turns on.
●Based on the user's content, the model may predict a function call;
after the caller sends back the function response, the model can
understand it and generate text OR another function call.
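
To make that flow concrete, here is a sketch of one manual round trip, without automatic function calling, reusing the multiply tool from the earlier slides (glm is google.ai.generativelanguage):

```python
import google.ai.generativelanguage as glm

# Start a chat WITHOUT automatic function calling: we run the tool ourselves.
chat = model.start_chat()
response = chat.send_message('What is 57 * 44?')

part = response.candidates[0].content.parts[0]
if part.function_call:
    fn = part.function_call            # the model predicted a Function Call
    result = multiply(**fn.args)       # the caller executes the function
    # Send the Function Response back; the model then answers in text
    # (or asks for another function call).
    response = chat.send_message(glm.Content(parts=[glm.Part(
        function_response=glm.FunctionResponse(
            name=fn.name, response={'result': result}))]))

print(response.text)
```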

Function calling - More Examples
Function Calling
●Wikipedia research aid
○Integrates a search tool.
○Uses the Gemini API inside the function call to summarize pages.

def wikipedia_search(queries: list[str]) -> list[str]:
    ...

https://ai.google.dev/docs/search_reranking_using_embeddings/

model = genai.GenerativeModel('gemini-pro', tools=[wikipedia_search])

chat = model.start_chat(enable_automatic_function_calling=True)

query = "Explain how deep-sea life survives."

res = chat.send_message(instructions.format(query=query))
# Searching for "How do deep-sea creatures survive the extreme pressure?"
# Related search terms: [ 'Deep sea', 'Deep-sea community' , 'Deep-sea fish']
# Fetching page: "Deep sea"
# Information Source: https://en.wikipedia.org/wiki/Deep_sea
# Fetching page: "Deep-sea community"
# Information Source: https://en.wikipedia.org/wiki/Deep-sea_community
# Fetching page: "Deep-sea fish"
# Information Source: https://en.wikipedia.org/wiki/Deep-sea_fish
# Searching for "How do deep-sea creatures survive the cold temperatures?"
# Related search terms: [ 'Deep-sea community' , 'Deep sea', 'Deep-water coral']
# Fetching page: "Deep-water coral"
# Information Source: https://en.wikipedia.org/wiki/Deep-water_coral
# Searching for "What adaptations do deep-sea creatures have to survive?"
# Related search terms: [ 'Deep sea', 'Deep-sea community' , 'Deep-sea fish']
# Searching for "What are the unique characteristics of deep-sea creatures?"
# Related search terms: [ 'Deep-sea community' , 'Deep-sea fish', 'Deep sea']
# Searching for "How do deep-sea creatures find food?"
# Related search terms: [ 'Deep-sea community' , 'Deep-sea fish', 'The Deep (TV series)' ]
# Fetching page: "The Deep (TV series)"
# Information Source: https://en.wikipedia.org/wiki/The_Deep_(TV_series)
# Searching for "How do deep-sea creatures communicate with each other?"

Function calling - More Examples
Function Calling
●SQL Talk
○Use function calling to talk to a database.
○Live example: https://goo.gle/gemini-fn-call-sql
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app

sql_query_func = FunctionDeclaration(
    name="sql_query",
    description="Get information from data in BigQuery using SQL queries",
    parameters={
        "type": "object",
        "properties": {
            "query": {
                "type": "string",
                "description": "SQL query on a single line ...

Function Calling
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app
SELECT ROUND((
COUNT(DISTINCT IF(returned_at IS NOT NULL, order_id, NULL)) /
COUNT(DISTINCT order_id)) * 100, 2)
AS return_rate
FROM thelook_ecommerce.orders

Function calling - Schema
Function Calling
●Automatically building the schema from the type hints doesn't
[currently 02/2024] work for everything.
●The allowed types are actually:

AllowedType = (
    int | float | str | bool |
    list['AllowedType'] |
    dict[str, 'AllowedType']
)

https://ai.google.dev/tutorials/function_calling_python_quickstart

Function calling - Schema
Function Calling
●Let's look at how the schema is constructed.
https://ai.google.dev/tutorials/function_calling_python_quickstart

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[multiply])

model._tools.to_proto()

[function_declarations {
   name: "multiply"
   description: "Returns a * b."
   parameters {
     type_: OBJECT
     properties {
       key: "b" value { type_: NUMBER }}
     properties {
       key: "a" value { type_: NUMBER }}
     required: "a" required: "b" }}]

Function calling - Schema
Function Calling
●It's an OpenAPI schema, written as a protobuf.
●The protobuf classes are available in the
google.ai.generativelanguage client library.
●Reference docs:
https://ai.google.dev/api/python/google/ai/generativelanguage/FunctionDeclaration
https://ai.google.dev/tutorials/function_calling_python_quickstart

import google.ai.generativelanguage as glm

calculator = glm.Tool(
    function_declarations=[
        glm.FunctionDeclaration(
            name='multiply',
            description="Returns the product of two numbers.",
            parameters=glm.Schema(
                type=glm.Type.OBJECT,
                properties={
                    'a': glm.Schema(type=glm.Type.NUMBER),
                    'b': glm.Schema(type=glm.Type.NUMBER)},
                required=['a', 'b']))])

Function calling - Schema
Function Calling
●They can be written out as JSON-compatible objects as well.
https://ai.google.dev/tutorials/function_calling_python_quickstart

calculator = {
    'function_declarations': [{
        'name': 'multiply',
        'description': 'Returns the product of two numbers.',
        'parameters': {
            'type': 'OBJECT',
            'properties': {
                'a': {'type': 'NUMBER'},
                'b': {'type': 'NUMBER'}},
            'required': ['a', 'b']}}]}

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[calculator])

Function calling - Structured data
Function Calling
●Structured data extraction.
●You can just ask the model to do it and return JSON.
https://ai.google.dev/tutorials/structured_data_extraction

import textwrap

response = model.generate_content(textwrap.dedent("""\
Please return JSON describing the people, places, things and relationships from this
story using the following schema:

{"people": list[PERSON], "places": list[PLACE], "things": list[THING], "relationships": list[RELATIONSHIP]}

PERSON = {"name": str, "description": str, "start_place_name": str, "end_place_name": str}
PLACE = {"name": str, "description": str}
THING = {"name": str, "description": str, "start_place_name": str, "end_place_name": str}
RELATIONSHIP = {"person_1_name": str, "person_2_name": str, "relationship": str}

Here is the story:

""") + story)

Function calling - Structured data
Function Calling
●Asking for JSON often works.
●Function calling lets you strictly describe the schema.
●With a strict description, we can strictly enforce that that's
what gets returned.
https://ai.google.dev/tutorials/structured_data_extraction

add_to_database = glm.FunctionDeclaration(
    name="add_to_database",
    description="Adds entities to the database.",
    parameters=glm.Schema(
        type=glm.Type.OBJECT,
        properties={
            'people': glm.Schema(
                type=glm.Type.ARRAY,
                items=glm.Schema(
                    type=glm.Type.OBJECT,
                    properties={
                        'name': glm.Schema(type=glm.Type.STRING),
                        'description': glm.Schema(type=glm.Type.STRING),
                        'start_place_name': glm.Schema(type=glm.Type.STRING),
                        'end_place_name': glm.Schema(type=glm.Type.STRING)})),
            'places': glm.Schema(
...

add_to_database = glm.FunctionDeclaration(
    name="add_to_database",
    description="Adds entities to the database.",
    parameters={
        "type": "OBJECT",
        "properties": {
            'people': {
                "type": "ARRAY",
                "items": {
                    "type": "OBJECT",
                    "properties": {
                        'name': {"type": "STRING"},
                        'description': {"type": "STRING"},
                        'start_place_name': {"type": "STRING"},
                        'end_place_name': {"type": "STRING"}}}},
            'places': {...},
...

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[add_to_database])

Function calling - Structured data
Function Calling
●Why?
○Function calling doesn't return a string.
○Function calling returns data structures: you don't parse text.
https://ai.google.dev/tutorials/structured_data_extraction

import pprint

response = model.generate_content(f"""
Please add the people, places, things and relationships
from this story to the database:

{story}
""")

fc = response.candidates[0].content.parts[0].function_call
pprint.pprint(type(fc).to_dict(fc))

{'name': 'add_to_database',
 'args': {'people': [{'description': 'A curious girl with wild hair of '
                                     'midnight blue and eyes that shone like...',
                      'name': 'Luna',
                      ...
          'places': [{'description': 'Nestled between rolling hills and '
                                     'sparkling rivers, where Luna and her '
                                     'family lived.',
                      'name': 'Small town'}],
          'relationships': [{'person_1_name': 'Luna', 'person_2_name': 'Elara',
                             'relationship': 'Mother-daughter'},
                            {'person_1_name': 'Luna', 'person_2_name': 'Orion',
                             'relationship': 'Father-daughter'}],
          'things': [{'description': 'A dusty, forgotten backpack that Luna '
                                     'found in the attic, which possessed a...',
                      'end_place_name': None,
                      'name': 'Backpack',
                      'start_place_name': 'Attic'},
                     ...