Workshop Build With AI - Google Developers Group Rio Verde
Size: 3.46 MB
Language: en
Added: May 07, 2024
Slides: 108 pages
Slide Content
Introduction to the Google Gemini API
Sandro Moreira (he/him), Google Mentor @SandroEnsinaAI
Visit: https://github.com/google-gemini/cookbook
Getting started with development
1. Go to Google AI Studio.
2. Sign in with your Google account.
3. Create an API key.
4. Use the Python quickstart, or make a REST API request using curl.
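The REST step above can be sketched in Python with only the standard library. The endpoint and payload shape follow the public REST quickstart; `GOOGLE_API_KEY` is a placeholder environment variable you would set yourself, and the prompt text is illustrative:

```python
import json
import os
import urllib.request

# The v1beta generateContent REST endpoint; the API key goes in the query string.
API_KEY = os.environ.get("GOOGLE_API_KEY", "")
URL = ("https://generativelanguage.googleapis.com/v1beta/"
       f"models/gemini-pro:generateContent?key={API_KEY}")

# Minimal request body: one user turn with a single text part.
payload = {"contents": [{"parts": [{"text": "Explain what an API key is."}]}]}
body = json.dumps(payload).encode("utf-8")

if API_KEY:  # only call the API when a key is configured
    req = urllib.request.Request(
        URL, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["candidates"][0]["content"]["parts"][0]["text"])
```

The same request can be made from the shell by POSTing the JSON body to that URL with curl.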
User: Explain this joke: "What did the tomato go to the bank for? To get an 'extrato'."

Model: The joke relies on a pun on the Portuguese word "extrato". It suggests the tomato went to the bank to get an "extrato", but instead of the bank statement (extrato bancário) it wanted tomato extract (extrato de tomate), playing on the word's double meaning.
●Extrato bancário (bank statement): a document showing a bank account's transaction history.
●Extrato de tomate (tomato extract): a concentrated product made from tomatoes.
Explaining LLMs

User: Give me five project ideas for an Artificial Intelligence fair.

Model:
1. **Painting with Neural Networks**
Implement a neural network model, such as a Generative Adversarial Network (GAN), to turn photos into paintings in different styles (e.g., impressionist, cubist, etc.).
2. **Emotion Recognition**
Build a system that uses a neural network to recognize emotions in images of faces.
3. **Music with Neural Networks**
Create a system that uses a recurrent neural network (RNN) to generate melodies or accompany a melody played by a visitor.
4. **Sign Language Translation**
Build a system that uses a convolutional neural network (CNN) to recognize sign-language gestures and translate them into text or speech.
5. **Game Controller from Player Images**
Use the player's body images and gestures to run commands that control a game character.
Google is a pioneer in the AI industry

Responsible AI
●3,000 researchers
●7,000 publications
●Built and tested for safety
●Privacy by design
●Upholds high scientific standards
●Accountable to people
●Socially beneficial
●Avoids creating unfair bias
2015: Google DeepMind's AlphaGo defeats the Go champion.
2016: Google DeepMind helps detect eye disease.
2017: Google invents the Transformer, the kickoff of the LLM revolution.
2018: Google creates a groundbreaking large language model (BERT).
2019: Text-to-Text Transfer Transformer (T5), a 10B-parameter LLM, is open-sourced.
2020: Google LaMDA, a model trained for conversation.
2022: AlphaFold predicts the structures of all known proteins.
2023: Conversational AI service built with PaLM 2.
2024: Gemini, a family of LLMs and multimodal products; multimodality on-device (Android AICore); ai.google.dev/gemma
The Gemini Ecosystem
Google's most advanced AI

Models
●For Developers: Gemini API (in Google AI Studio + ai.google.dev)
●For Consumers: Gemini app and web; Gemini in the Google App; Gemini in Gmail, Docs…
●For Business and Enterprise: Gemini for Google Workspace; Gemini for Google Cloud; Gemini in Vertex AI

Let's focus a bit more here.
Getting started with the Gemini API
AI Studio
aistudio.google.com
●Generate API Keys
●Create, test, and save prompts
●Customize models in minutes
●Generate starter code
AI Studio
ai.google.dev
REST API + client libraries (SDKs) for Python, Node, Java, and Swift
Vertex AI
●Enterprise-grade support.
●Full MLOps (examples: model evaluation, monitoring, registry).
●Check it out when you're ready for production.

Gemini API and Vertex AI
●Both give access to Gemini-family models.
●Migration guide: ai.google.dev/docs/migrate_to_cloud
Basically, prompts

Write a poem about a magical backpack.

In a world of wonder, behold,
A backpack with tales yet untold.
Its fabric shimmers, alive with might,
A portal to realms, day and night.

aistudio.google.com

Write a poem about a magical backpack.

The backpack hangs there, a simple sack of cloth,
But its seams hold secrets beyond the human world.
A whole universe, where everything can fit,
From forests and oceans to a kingdom in which to hide.
...

aistudio.google.com
Generate a short tip about JavaScript. Good tips are short and
include example code demonstrating them.

**Tip: Use Array.prototype.flat() to flatten an array of arrays.**

Generate a short tip about JavaScript. Good tips are short and
include example code demonstrating them.

topic: let and const
tip: Prefer `let` and `const` to declare variables instead of `var`,
since `var` has no block scope or protection against reassignment.
e.g. const myVar = "EXAMPLE";

topic: arrow syntax
tip:
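The few-shot pattern above can be sent through the API as one prompt string. A minimal sketch, assuming the google.generativeai SDK with an already-configured API key (the try/except simply skips the call when the SDK or key is unavailable):

```python
# Few-shot prompt: instruction plus one worked example, ending at the slot
# we want the model to fill in.
FEW_SHOT_PROMPT = """\
Generate a short tip about JavaScript. Good tips are short and
include example code demonstrating them.

topic: let and const
tip: Prefer `let` and `const` over `var`, since `var` has no block scope.
e.g. const myVar = "EXAMPLE";

topic: arrow syntax
tip:"""

try:
    import google.generativeai as genai  # assumes genai.configure(api_key=...)
    model = genai.GenerativeModel('gemini-pro')
    print(model.generate_content(FEW_SHOT_PROMPT).text)
except Exception:
    pass  # SDK not installed or no key; the prompt above still shows the pattern
```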
import google.generativeai as genai

model = genai.GenerativeModel('models/gemini-pro')
resp = model.generate_content(
    'Write the first paragraph of a story about a magical backpack')

>>> print(resp.text)
Clara never imagined her life would change because of an old, dusty backpack. It was made of worn, faded fabric, with colorful patches and a frayed strap. She had found it forgotten in her grandmother's attic, among boxes of old photographs and broken toys...
chat = model.start_chat()
response = chat.send_message(
    "Hi, what should I have for dinner?")
print(response.text)
# 'Here are some suggestions...'
response = chat.send_message(
    "How do I make the first one?")
Generate Content Request
●Contents: a list of Content entries, e.g. a Content of role "user" containing multimodal Parts, a Content of role "model", and a "user" Content holding a text Part.
●Tools: specified by the caller.
●Safety Settings: configured by the caller.
●Generation Config: temperature, Top-P, Top-K, stop sequences, max output tokens, etc.
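In the Python SDK those request fields map onto keyword arguments of generate_content. A sketch with illustrative values (the SDK also accepts the plain-dict form of the generation config shown here; the try/except skips the call when the SDK or key is unavailable):

```python
# Generation Config fields from the request anatomy, as a plain dict.
generation_config = {
    "temperature": 0.7,
    "top_p": 0.95,
    "top_k": 40,
    "max_output_tokens": 256,
    "stop_sequences": ["\n\n"],
}

try:
    import google.generativeai as genai  # assumes an API key is configured
    model = genai.GenerativeModel('gemini-pro')
    response = model.generate_content(
        'Write a one-line greeting.',
        generation_config=generation_config)
    print(response.text)
except Exception:
    pass  # SDK not installed or no key; the config dict above is the point
```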
Generate Content Response
●Candidate(s) (note: only one candidate is returned today), each with:
○Candidate "Content": the generated text.
○Finish Reason: why the model stopped generating.
○Finish Message: set if a finish reason is present.
○Safety Ratings: how safe the response is.
●Feedback: feedback on the prompt.
○Block Reason
○Safety Ratings
…
https://ai.google.dev/docs/prompt_best_practices
●Chained prompts: make a plan and execute it.
●Context: a few small prompts.
●Generation parameters: temperature, safety settings, Top-P, Top-K.
Code Generation
●Content generation: generate code based on a user prompt.
●Convert human-language requests to machine requests.
●Rapid tool building.
Examples
Code generation
●Generate data
●Generate a SQL query
●Simulate execution
●Quick coding help: a smart pair programmer.
Write a unit test for this Python function:

def is_diver_neutrally_buoyant(density_water, density_diver):
    """Check whether a diver is neutrally buoyant."""
    if density_water == density_diver:
        return True
    return False

import unittest

class IsDiverNeutrallyBuoyantTest(unittest.TestCase):
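The slide stops at the class declaration; for reference, one way the requested test could be completed. The density values are illustrative assumptions, and the function is written to always return a boolean:

```python
import unittest

def is_diver_neutrally_buoyant(density_water, density_diver):
    """Check whether a diver is neutrally buoyant."""
    # Neutral buoyancy: the diver's density equals the water's density.
    return density_water == density_diver

class IsDiverNeutrallyBuoyantTest(unittest.TestCase):
    def test_equal_densities_are_neutral(self):
        self.assertTrue(is_diver_neutrally_buoyant(1000.0, 1000.0))

    def test_different_densities_are_not_neutral(self):
        self.assertFalse(is_diver_neutrally_buoyant(1000.0, 985.0))
```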
result = model.generate_content(f"""
    Explain how there can be life in the deep sea.
    Answer the question based on the following document:
    {document}""")
Use the prompt's context window
Learning more
●Limited by the model's context length
○gemini-1.0-pro: 30K tokens.

Search & IR

●gemini-1.5-pro: 1M tokens.
○Remember: everything is tokens.
●Using context may be more flexible than Retrieval-Augmented Generation (RAG).
●More input context means higher latency.
●Join the waitlist: https://aistudio.google.com/app/waitlist/97445851
Use Attributed Question Answering (AQA) with inline chunks
AQA
●Instructions + Chunks + Question in one request
●AQA is a new feature
○Feedback is welcome
○The AQA model is specifically tuned for RAG
import google.generativeai.answer
import google.generativeai as genai

chunks = ["Your Googlecar has a climate control system that allo...",
          "Your Googlecar has a large touchscreen display that p...",
          "Your Googlecar has an automatic transmission. To shif..."]

answer = genai.answer.generate_answer(
    contents='How do you make it less windy?',
    answer_style=genai.answer.AnswerStyle.ABSTRACTIVE,
    inline_passages=chunks)
answer { content { parts {
text: "To decrease the airflow inside the car, "
"turn the airflow knob counterclockwise." }}
...
grounding_attributions {
source_id { grounding_passage { passage_id: "0" } }
content { parts { text:
"Your Googlecar has a climate control system that allows..." }}}
}
answerable_probability: 0.995222747
Use AQA with inline chunks
●Limited by the request size: max 4 MB.
●Inefficient when asking about the same data repeatedly.
●Simple to use.
●Example: handle chunks returned by a search tool.
Use AQA with the retriever service
●Corpus > Document > Chunk hierarchy (a corpus contains documents, and documents contain chunks).
●Pass a reference to a document or corpus to the generate_answer function.
from google.generativeai import retriever

my_corpus = retriever.create_corpus(display_name="My Documents")
for document in my_documents:
    doc = my_corpus.create_document(display_name=document.title)
    doc.batch_create_chunks(htmlchunker.chunk(document.html))
●Supports filtering based on chunk metadata.
●Limits:
○10k documents per corpus
○1M chunks per corpus
○5 corpora per user
Function Calling
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app
goo.gle/gemini-fn-call-sql-github
SELECT ROUND((
COUNT(DISTINCT IF(returned_at IS NOT NULL, order_id, NULL)) /
COUNT(DISTINCT order_id)) * 100, 2)
AS return_rate
FROM thelook_ecommerce.orders
goo.gle/gemini-fn-call-sql-github
Function Calling
●Describe external functions to the model.
●The model may ask you to call the function to help it respond
to your queries.
●Endless possibilities for integrating external tools.
Function calling - Basics
●How? The google.generativeai SDK will inspect the function's type hints to determine the schema.
●Allowed types are limited:
AllowedTypes = (
    int | float | str | bool | list | dict)
https://ai.google.dev/tutorials/function_calling_python_quickstart

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[multiply])
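The `multiply` tool passed above is never shown on the slide; here is a minimal definition consistent with the schema printed later in the deck. The type hints are what drive the generated schema, and the try/except just skips model construction when the SDK is unavailable:

```python
def multiply(a: float, b: float) -> float:
    """returns a * b."""
    return a * b

try:
    import google.generativeai as genai  # assumes an API key is configured
    model = genai.GenerativeModel(
        model_name='gemini-1.0-pro',
        tools=[multiply])
except ImportError:
    pass  # SDK not installed; the function itself still works standalone
```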
Function calling - Basics
●Because function calling requires alternating turns, it's easiest to use through chat.
●Enable "automatic function calling" when you start a chat, and the ChatSession will call the function(s) for you.
○You don't have to use automatic function calling; it just makes simple cases easier.
https://ai.google.dev/tutorials/function_calling_python_quickstart

response = chat.send_message(
    'I have 57 cats, each owns 44 mittens, '
    'how many mittens is that in total?')
print(response.text)
# The number of mittens in total is 2508.
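The chat used above is started with automatic function calling enabled, so the ChatSession runs the tool itself. A sketch, with the model name and `multiply` tool as in the earlier slide (the try/except skips setup when the SDK is unavailable):

```python
def multiply(a: float, b: float) -> float:
    """returns a * b."""
    return a * b

try:
    import google.generativeai as genai  # assumes an API key is configured
    model = genai.GenerativeModel('gemini-1.0-pro', tools=[multiply])
    # The ChatSession intercepts the model's function_call, runs multiply(),
    # sends the result back, and returns only the final text response.
    chat = model.start_chat(enable_automatic_function_calling=True)
except ImportError:
    pass  # SDK not installed; multiply() still works standalone
```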
Function calling - Basics
●What happened? Use the chat history to find out.
●The chat history collects all the function calls and responses that took place.
https://ai.google.dev/tutorials/function_calling_python_quickstart

for content in chat.history:
    part = content.parts[0]
    print(content.role, "->", type(part).to_dict(part))

# user -> {'text': 'I have 57 cats, each owns 44 mittens, '
#                  'how many mittens is that in total?'}
# model -> {'function_call': {'name': 'multiply',
#                             'args': {'a': 57.0, 'b': 44.0}}}
# user -> {'function_response': {'name': 'multiply',
#                                'response': {'result': 2508.0}}}
# model -> {'text': 'The number of mittens in total is 2508.'}
Function Calling interaction
●Contents: the user sends Text; the model may predict a Function Call based on the user content; the caller sends back a Function Response; the model can understand the function response and generate text OR another function call.
●Tools: Function Declarations; if one or more function declarations are provided, the function calling feature turns on.
Function calling - More Examples
●Wikipedia research aid
○Integrates a search tool.
○Uses the Gemini API inside the function call to summarize pages.
res = chat.send_message(instructions.format(query=query))

# Searching for "How do deep-sea creatures survive the extreme pressure?"
# Related search terms: ['Deep sea', 'Deep-sea community', 'Deep-sea fish']
# Fetching page: "Deep sea"
# Information Source: https://en.wikipedia.org/wiki/Deep_sea
# Fetching page: "Deep-sea community"
# Information Source: https://en.wikipedia.org/wiki/Deep-sea_community
# Fetching page: "Deep-sea fish"
# Information Source: https://en.wikipedia.org/wiki/Deep-sea_fish
# Searching for "How do deep-sea creatures survive the cold temperatures?"
# Related search terms: ['Deep-sea community', 'Deep sea', 'Deep-water coral']
# Fetching page: "Deep-water coral"
# Information Source: https://en.wikipedia.org/wiki/Deep-water_coral
# Searching for "What adaptations do deep-sea creatures have to survive?"
# Related search terms: ['Deep sea', 'Deep-sea community', 'Deep-sea fish']
# Searching for "What are the unique characteristics of deep-sea creatures?"
# Related search terms: ['Deep-sea community', 'Deep-sea fish', 'Deep sea']
# Searching for "How do deep-sea creatures find food?"
# Related search terms: ['Deep-sea community', 'Deep-sea fish', 'The Deep (TV series)']
# Fetching page: "The Deep (TV series)"
# Information Source: https://en.wikipedia.org/wiki/The_Deep_(TV_series)
# Searching for "How do deep-sea creatures communicate with each other?"
Function calling - More Examples
●SQL Talk
○Use function calling to talk to a database.
○Live example: https://goo.gle/gemini-fn-call-sql
https://github.com/GoogleCloudPlatform/generative-ai/tree/main/gemini/function-calling/sql-talk-app

sql_query_func = FunctionDeclaration(
    name="sql_query",
    description="Get information from data in BigQuery using SQL queries",
    parameters={
        "type": "object",
        "properties": {
            "query": {
                "type": "string",
                "description": "SQL query on a single line ...
Function calling - Schema
●Automatically building the schema from the type hints doesn't [currently, 02/2024] work for everything.
●The allowed types are more limited than Python's full type system.
Function calling - Schema
●Let's look at how the schema is constructed.
https://ai.google.dev/tutorials/function_calling_python_quickstart

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[multiply])

model._tools.to_proto()
[function_declarations {
  name: "multiply"
  description: "returns a * b."
  parameters {
    type_: OBJECT
    properties {
      key: "b" value { type_: NUMBER }}
    properties {
      key: "a" value { type_: NUMBER }}
    required: "a" required: "b" }}]
Function calling - Schema
●It's an OpenAPI schema, written as a protobuf.
●The protobuf classes are available in the google.ai.generativelanguage client library (imported below as glm).
●Reference docs:

import google.ai.generativelanguage as glm

calculator = glm.Tool(
    function_declarations=[
        glm.FunctionDeclaration(
            name='multiply',
            description="Returns the product of two numbers.",
            parameters=glm.Schema(
                type=glm.Type.OBJECT,
                properties={
                    'a': glm.Schema(type=glm.Type.NUMBER),
                    'b': glm.Schema(type=glm.Type.NUMBER)},
                required=['a', 'b']))])
Function calling - Schema
●They can be written out as JSON-compatible objects as well.
https://ai.google.dev/tutorials/function_calling_python_quickstart

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[calculator])
Function calling - Structured data
●Structured data extraction.
●You can just ask the model to do it and return JSON.
https://ai.google.dev/tutorials/structured_data_extraction

response = model.generate_content(textwrap.dedent("""\
    Please return JSON describing the people, places, things and relationships from this
    story using the following schema:
Function calling - Structured data
●Asking for JSON often works.
●Function calling lets you strictly describe the schema.
●With a strict description, you can strictly enforce what gets returned.
https://ai.google.dev/tutorials/structured_data_extraction

model = genai.GenerativeModel(
    model_name='gemini-1.0-pro',
    tools=[add_to_database])
Function calling - Structured data
●Why?
○Function calling doesn't return a string.
○Function calling returns data structures: you don't parse text.
https://ai.google.dev/tutorials/structured_data_extraction

response = model.generate_content(f"""
    Please add the people, places, things and relationships
    from this story to the database:
    {story}
    """)

fc = response.candidates[0].content.parts[0].function_call
pprint.pprint(type(fc).to_dict(fc))
{'name': 'add_to_database',
 'args': {'people': [{'description': 'A curious girl with wild hair of '
                                     'midnight blue and eyes that shone like...',
                      'name': 'Luna',
                      ...
          'places': [{'description': 'Nestled between rolling hills and '
                                     'sparkling rivers, where Luna and her '
                                     'family lived.',
                      'name': 'Small town'}],
          'relationships': [{'person_1_name': 'Luna', 'person_2_name': 'Elara',
                             'relationship': 'Mother-daughter'},
                            {'person_1_name': 'Luna', 'person_2_name': 'Orion',
                             'relationship': 'Father-daughter'}],
          'things': [{'description': 'A dusty, forgotten backpack that Luna '
                                     'found in the attic, which possessed a...',
                      'end_place_name': None,
                      'name': 'Backpack',
                      'start_place_name': 'Attic'},
                     ...