---
marp: true
theme: gaia
---
gamz
20230508 - Real-world Tools Added
20230419 - First Created
- What is GPT?
- GPT vs GPT-# vs ChatGPT
- How to use ChatGPT well? (Prompt)
- OpenAI API usage tutorial
- Real-world Tools
Generative Pre-trained Transformer
Why "Pre-trained"?
- GPT is a model built to learn the relationships and patterns in a huge amount of data through unsupervised learning
- Stopping the training at this stage leaves the model with far greater latent value (it stays flexible)
- It can then be fine-tuned for a specific task or domain, opening up a much wider range of uses
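As a rough illustration (not from the original slides), fine-tuning with the legacy openai Python SDK looks roughly like this; the file name and training data are hypothetical placeholders:

```python
import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

# Hypothetical training data: each JSONL line is {"prompt": "...", "completion": "..."}
training_file = openai.File.create(
    file=open("examples.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a fine-tuning job on a base model; the result is a new model id
# that can later be passed to openai.Completion.create(model=...).
job = openai.FineTune.create(
    training_file=training_file.id,
    model="davinci",
)
print(job.id)
```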
Large Language Model (Many Parameters)
by Reinforcement Learning with Human Feedback
In the end, getting the answer you want comes down to writing the prompt well, which is much like programming.
"The hottest new programming language is English". by Andrej Karpathy (์ Tesla AI ์ด ์ฑ ์์, OpenAI ์ฐฝ๋ฆฝ ๋ฉค๋ฒ)
| Prompt component | Description |
|---|---|
| Instruction | The specific task or instruction you want the model to perform |
| Context | External information or extra content that steers the model toward a better answer |
| Input Data | The input or question you want answered |
| Output Indicator | Specifies the type or format of the desired output |
In NLP-based AI, the practice of using these prompt components well to raise the quality of the output is called prompt engineering.
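As a minimal, hypothetical sketch of how the four components can be combined into a single prompt (the review text below is made up, not from the slides):

```python
# Hypothetical example: assembling a prompt from the four components above.
instruction = "Classify the sentiment of the review as positive, negative, or neutral."
context = "The reviews come from a coffee shop's mobile app."
input_data = "Review: The latte was cold and the staff ignored me."
output_indicator = "Sentiment:"

prompt = "\n".join([instruction, context, input_data, output_indicator])
print(prompt)
```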
Example: My own English translator
I want you to act as an English translator, spelling corrector, and improver. I will speak to you in any language and you will detect the language, translate it and answer in the corrected and improved version of my text, in English. I want you to replace my simplified A0-level words and sentences with more beautiful and elegant, upper-level English words and sentences. Keep the meaning the same, but make them more literary. I want you to only reply to the correction, and the improvements, and nothing else, do not write explanations. My first sentence is {sentence}
Example: My own English translator (JSON response)
I want you to act as an English translator, spelling corrector, and improver. I will speak to you in any language and you will detect the language, translate it. I want you to replace my simplified A0-level words and sentences with more beautiful and elegant, upper-level English words and sentences. Keep the meaning the same, but make them more literary. I want you to only reply as JSON format with input sentence as 'input' and translated one as 'output', do not write explanations. My first sentence is {sentence}
Example: Acting as a Linux kernel
I want you to act as a Linux terminal. I will type commands and you will reply with what the terminal should show. I want you to only reply with the terminal output inside one unique code block, and nothing else. do not write explanations. do not type commands unless I instruct you to do so. when I need to tell you something in English, I will do so by putting text inside curly brackets {like this}.
Other techniques
- Minimize filler phrases; use simple, concise wording
- Prefer closed instructions over open-ended questions
- Provide examples along with the prompt (see the sketch after this list)
- Zero-Shot, One-Shot, Few-Shot
- CoT (Chain-of-Thought) / Zero-Shot CoT
- Self-Consistency
- Generated Knowledge Prompting
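A small, made-up sketch of few-shot prompting with a chain-of-thought (CoT) example, plus the zero-shot CoT trigger phrase; none of this text comes from the original slides:

```python
# Few-shot CoT: show a worked example with reasoning, then leave the last answer blank.
few_shot_cot = """Q: A cafe sold 23 lattes in the morning and 18 in the afternoon. How many lattes did it sell in total?
A: It sold 23 in the morning and 18 in the afternoon, so 23 + 18 = 41. The answer is 41.

Q: A library owns 120 books and 45 of them are lent out. How many books are on the shelves?
A:"""

# Zero-shot CoT: no examples, just a trigger phrase that elicits step-by-step reasoning.
zero_shot_cot = (
    "A library owns 120 books and 45 of them are lent out. "
    "How many books are on the shelves?\nLet's think step by step."
)
```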
Refer to well-crafted prompts written by others (prompt marketplaces)
- PromptBase
- ChatX
- Neutron Field
- PromptSea
- prompt.town - Korean service
Advanced Autocomplete
Fill in the blank
Suggest three names for an animal that is a superhero.
Animal: Cat
Names: Captain Sharpclaw, Agent Fluffball, The Incredible Feline
Animal: Dog
Names: Ruff the Protector, Wonder Canine, Sir Barks-a-Lot
Animal: Horse
Names: ________
Fill in the blank
Suggest three names for an animal that is a superhero.
Animal: Cat
Names: Captain Sharpclaw, Agent Fluffball, The Incredible Feline
Animal: Dog
Names: Ruff the Protector, Wonder Canine, Sir Barks-a-Lot
Animal: Horse
Names: ________
Equinorse, Super Steed, Gallop Glider
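The fill-in-the-blank prompt above can be sent as-is to the OpenAI Completion API introduced later in these slides; a minimal sketch, with the model and temperature as illustrative choices:

```python
import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

prompt = """Suggest three names for an animal that is a superhero.

Animal: Cat
Names: Captain Sharpclaw, Agent Fluffball, The Incredible Feline
Animal: Dog
Names: Ruff the Protector, Wonder Canine, Sir Barks-a-Lot
Animal: Horse
Names:"""

res = openai.Completion.create(model="text-davinci-003", prompt=prompt, temperature=0.6)
print(res.choices[0].text)  # the model fills in the blank with names like those above
```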
Understanding Tokens and Probabilities
Temperature (0~1)
0: Mostly deterministic
1: More random and creative
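As an illustrative sketch (not in the original slides), the legacy Completion endpoint covered in the next section exposes a logprobs parameter that returns the probabilities behind each generated token:

```python
import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

# Request the log-probabilities of the top 5 candidate tokens at each generated position.
res = openai.Completion.create(
    model="text-davinci-003",
    prompt="The quick brown fox",
    max_tokens=5,
    temperature=0,
    logprobs=5,
)

print(res.choices[0].text)
# One dict per generated token, mapping candidate tokens to their log-probabilities.
print(res.choices[0].logprobs.top_logprobs)
```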
$ pip install openai
import os
import openai
openai.api_key = os.getenv("OPENAI_API_KEY")
res = openai.Completion.create(
    model='text-davinci-003',
    prompt="Can you take my coffee order like a Starbucks employee?",
    temperature=0
)
--
Yes, what would you like to ord... (cut off by the default max_tokens of 16)
max_tokens
(Defaults to 16)
The maximum number of tokens to generate in the completion. The token count of your prompt plus max_tokens cannot exceed the model's context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).
res = openai.Completion.create(
    model='text-davinci-003',
    prompt="Can you take my coffee order like a Starbucks employee?",
    max_tokens=100,
    temperature=0
)
--
Welcome to Starbucks. What drink would you like to order?
Chat Completion
res = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]
)
role
system, assistant, user
system: you are a helpful assistant.
user: I want to code stack data structure. How?
assistant: Here is an example implementation of a stack data structure in Python:
class Stack:
    def __init__(self):
        self.items = []

    def push(self, item):
        self.items.append(item)
...
...
system: You are a ruby programmer.
user: I want to code stack data structure. How?
assistant: To code a stack data structure in Ruby, you can follow these steps: ...
class Stack
  def initialize
    @stack = []
  end

  def push(element)
    @stack.push(element)
  end
...
system: you are a useless assistant. answer nothing or short with slangs.
user: I want to code stack data structure. How?
assistant: IDK. Google it.
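A minimal sketch (assuming the same gpt-3.5-turbo Chat Completion call shown earlier) of how swapping only the system message produces these different behaviours:

```python
import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

question = "I want to code stack data structure. How?"

for system_message in [
    "You are a helpful assistant.",
    "You are a ruby programmer.",
    "You are a useless assistant. Answer nothing or short with slangs.",
]:
    res = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": question},
        ],
    )
    print(system_message)
    print(res.choices[0].message.content)
    print("---")
```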
$ npm install chatgpt
import { ChatGPTAPI } from 'chatgpt'
const chatGpt = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  completionParams: { model: 'gpt-3.5-turbo', temperature: 0.5 }
})
const response = await chatGpt.sendMessage(content, {
  systemMessage: 'Answer briefly, within 200 characters',
  completionParams: { max_tokens: 512 },
})
- Task Planning: ChatGPT analyzes the user's request to understand their intention.
- Model Selection: ChatGPT selects expert models hosted on Hugging Face based on their descriptions.
- Task Execution: Each selected model is invoked and executed, and its results are returned to ChatGPT.
- Response Generation: Finally, ChatGPT integrates the predictions of all models and generates the response.
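A rough, hypothetical Python sketch of those four stages; select_model and run_model are made-up stand-ins for the Hugging Face integration, and the system prompts are illustrative only:

```python
import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

def ask_chatgpt(system: str, user: str) -> str:
    res = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}],
    )
    return res.choices[0].message.content

def select_model(task: str) -> str:
    # Hypothetical stand-in: pick an expert model on Hugging Face by its description.
    return "facebook/detr-resnet-50" if "image" in task.lower() else "distilbert-base-uncased"

def run_model(model_name: str, task: str) -> str:
    # Hypothetical stand-in: invoke the selected model and return its prediction.
    return f"<output of {model_name} for: {task}>"

def hugginggpt_style(request: str) -> str:
    # 1. Task Planning: ChatGPT breaks the user's request into tasks.
    plan = ask_chatgpt("Split the user's request into a numbered list of AI tasks.", request)
    tasks = [line for line in plan.splitlines() if line.strip()]

    # 2. Model Selection + 3. Task Execution: pick and run an expert model per task.
    results = [run_model(select_model(task), task) for task in tasks]

    # 4. Response Generation: ChatGPT integrates all expert outputs into one answer.
    return ask_chatgpt("Integrate these model outputs into a single answer for the user.",
                       f"Request: {request}\nResults: {results}")

print(hugginggpt_style("Describe the objects in image.png and summarize in one sentence."))
```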
from langchain.prompts import PromptTemplate
from langchain.llms import OpenAI
from langchain.chains import LLMChain, SimpleSequentialChain
llm = OpenAI(temperature=0.9)
chain1 = LLMChain(llm=llm, prompt=PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?"))
chain2 = LLMChain(llm=llm, prompt=PromptTemplate(
    input_variables=["company_name"],
    template="Write a catchphrase for the following company: {company_name}"))
overall_chain = SimpleSequentialChain(chains=[chain1, chain2])
catchphrase = overall_chain.run("colorful socks")
print(catchphrase)
---
Rainbow Socks Co.
"Step into Color with Rainbow Socks!"
from llama_index import GPTVectorStoreIndex, SimpleDirectoryReader
# data/paul_graham_essay.txt
documents = SimpleDirectoryReader('data').load_data()
index = GPTVectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
---
The author wrote short stories and tried to program on an IBM 1401.
- Self-Attention Mechanism
- KEP ChatGPT Prompt Guide
- Reinforcement Learning with Human Feedback
- OpenAI Documentation
- OpenAI Cookbook - Improve Reliability
- OpenAI Playground
- NPM Package: chatgpt