Let's write Java code that receives streaming responses from the OpenAI (ChatGPT) completions API.
An API key is needed for testing; how to issue one is out of scope here.
API key issuance: https://platform.openai.com/account/api-keys
API test
First, let's check that the issued key works by calling the API directly.
We will use the HTTP client built into IntelliJ for this test.
Replace <API key> with your own key.
The API used here is documented at:
https://platform.openai.com/docs/api-reference/chat/create
[gpt.http]
POST https://api.openai.com/v1/chat/completions
Content-Type: application/json
Authorization: Bearer <API key>
{
  "model": "gpt-3.5-turbo",
  "messages": [{
    "role": "user",
    "content": "안녕"
  }],
  "stream": true
}
Running the request above produces a result like the following.
[Response]
data: {"id":"chatcmpl-7JAWKy97ys368Xme5fIRjj1ujwfS5","object":"chat.completion.chunk","created":1684803028,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}
data: {"id":"chatcmpl-7JAWKy97ys368Xme5fIRjj1ujwfS5","object":"chat.completion.chunk","created":1684803028,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"하세요"},"index":0,"finish_reason":null}]}
<continued>
data: {"id":"chatcmpl-7JAWKy97ys368Xme5fIRjj1ujwfS5","object":"chat.completion.chunk","created":1684803028,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"?"},"index":0,"finish_reason":null}]}
data: {"id":"chatcmpl-7JAWKy97ys368Xme5fIRjj1ujwfS5","object":"chat.completion.chunk","created":1684803028,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{},"index":0,"finish_reason":"stop"}]}
data: [DONE]
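Each `data:` line is a JSON chunk whose `choices[0].delta.content` carries one fragment of the reply; a chunk with an empty `delta` and `finish_reason: "stop"` ends the message, followed by the `data: [DONE]` terminator. As a rough sketch (the class and method names below are made up for illustration), the content fragment can be pulled out of a single chunk like this; a real implementation should use a proper JSON parser such as Jackson rather than string scanning:

```java
// Naive extraction of the delta "content" value from one SSE data line.
// Class and method names are illustrative; this breaks on escaped quotes
// inside the content, which a real JSON parser would handle correctly.
public class SseChunkParser {

    public static String extractContent(String line) {
        String json = line.startsWith("data:") ? line.substring(5).trim() : line.trim();
        if (json.equals("[DONE]")) {
            return null; // end-of-stream terminator
        }
        String key = "\"content\":\"";
        int start = json.indexOf(key);
        if (start < 0) {
            return ""; // e.g. the first chunk only carries the assistant role
        }
        start += key.length();
        int end = json.indexOf('"', start);
        return json.substring(start, end);
    }
}
```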
pom.xml
Add the required entries to pom.xml.
We will use spring-boot-starter-webflux to receive the response as a stream.
<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <version>2.7.9</version>
  <relativePath/>
</parent>
<groupId>org.example</groupId>
<artifactId>chatgpt-test</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
  <maven.compiler.source>8</maven.compiler.source>
  <maven.compiler.target>8</maven.compiler.target>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
  </dependency>
  <!-- Only needed on Apple Silicon Macs (Netty's native macOS DNS resolver); omit elsewhere -->
  <dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-resolver-dns-native-macos</artifactId>
    <version>4.1.73.Final</version>
    <classifier>osx-aarch_64</classifier>
  </dependency>
  <dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
  </dependency>
</dependencies>
OpenAIService
Using a WebClient configured with the API key, call https://api.openai.com/v1/chat/completions and return the response body as a Flux<String>.
@Service
public class OpenAIService {

    @Value("${openai.url}")
    private String openAiUrl;

    @Value("${openai.key}")
    private String openAiKey;

    private WebClient client;

    private final ObjectMapper objectMapper = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
            .setPropertyNamingStrategy(PropertyNamingStrategies.SNAKE_CASE);

    @PostConstruct
    public void init() {
        client = WebClient.builder()
                .baseUrl(openAiUrl)
                .defaultHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE)
                .defaultHeader(HttpHeaders.AUTHORIZATION, "Bearer " + openAiKey)
                .build();
    }

    public Flux<String> ask(String question) throws JsonProcessingException {
        CompletionRequest request = new CompletionRequest(question);
        String requestValue = objectMapper.writeValueAsString(request);
        Flux<String> eventStream = client.post()
                .bodyValue(requestValue)
                .accept(MediaType.TEXT_EVENT_STREAM)
                .retrieve()
                .bodyToFlux(String.class);
        return eventStream;
    }
}
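The service serializes a CompletionRequest class that is not shown above. Based on the request body used in the .http test (model, messages, stream), a minimal sketch might look like the following; the exact shape in the original project is an assumption, and plain Java is used here instead of Lombok so the sketch stands alone. Note that the single-word field names are unaffected by the SNAKE_CASE naming strategy configured in the service.

```java
import java.util.Collections;
import java.util.List;

// Sketch (assumed shape) of the request body sent to /v1/chat/completions.
public class CompletionRequest {
    private final String model = "gpt-3.5-turbo";
    private final List<Message> messages;
    private final boolean stream = true; // ask the API for an SSE stream

    public CompletionRequest(String question) {
        this.messages = Collections.singletonList(new Message("user", question));
    }

    public String getModel() { return model; }
    public List<Message> getMessages() { return messages; }
    public boolean isStream() { return stream; }

    public static class Message {
        private final String role;
        private final String content;

        public Message(String role, String content) {
            this.role = role;
            this.content = content;
        }

        public String getRole() { return role; }
        public String getContent() { return content; }
    }
}
```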
OpenAIResource
Create a RestController to handle incoming requests.
@RestController
@RequiredArgsConstructor
@Slf4j
public class OpenAIResource {

    private final OpenAIService openAIService;

    @PostMapping(value = "ask", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> ask(@RequestBody Question question) {
        try {
            return openAIService.ask(question.getQuestion());
        } catch (JsonProcessingException e) {
            log.error(e.getMessage());
            return Flux.empty();
        }
    }
}

@Getter
public class Question {
    private String question;
}
Application
Start a Spring Boot application so it can accept API requests.
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
application.yml
server:
  port: 5001
openai:
  url: https://api.openai.com/v1/chat/completions
  key: <API key>
Running the Spring Boot application
Start the application.
Calling the API
Call the API as shown below.
POST http://localhost:5001/ask
Content-Type: application/json
{
  "question": "안녕"
}
API response
You can see the response stream back as shown below.
data:{"id":"chatcmpl-7JAf0j9pHX1DEst6VNUhYoK5xoeEj","object":"chat.completion.chunk","created":1684803566,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}
data:{"id":"chatcmpl-7JAf0j9pHX1DEst6VNUhYoK5xoeEj","object":"chat.completion.chunk","created":1684803566,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"하세요"},"index":0,"finish_reason":null}]}
<continued>
data:{"id":"chatcmpl-7JAf0j9pHX1DEst6VNUhYoK5xoeEj","object":"chat.completion.chunk","created":1684803566,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"?"},"index":0,"finish_reason":null}]}
data:{"id":"chatcmpl-7JAf0j9pHX1DEst6VNUhYoK5xoeEj","object":"chat.completion.chunk","created":1684803566,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{},"index":0,"finish_reason":"stop"}]}
data:[DONE]
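A client consuming this endpoint rebuilds the full answer ("하세요?" above) by concatenating the delta content values until the [DONE] terminator. A minimal sketch of that accumulation step, with illustrative names and the same caveat that real code should parse the JSON properly instead of scanning strings:

```java
import java.util.List;

// Joins the per-chunk delta tokens into the complete assistant message.
// Names are illustrative; payloads are the values after the "data:" prefix.
public class StreamAssembler {

    public static String assemble(List<String> payloads) {
        StringBuilder answer = new StringBuilder();
        for (String payload : payloads) {
            String json = payload.trim();
            if (json.equals("[DONE]")) {
                break; // terminator: stop accumulating
            }
            String key = "\"content\":\"";
            int start = json.indexOf(key);
            if (start < 0) {
                continue; // role-only or finish_reason chunks carry no text
            }
            start += key.length();
            answer.append(json, start, json.indexOf('"', start));
        }
        return answer.toString();
    }
}
```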