
Commit

Merge pull request #91 from mlverse/updates
Prepares for CRAN release
edgararuiz authored Apr 23, 2024
2 parents 0d916d1 + a95c0f1 commit 92a4486
Showing 18 changed files with 88 additions and 42 deletions.
1 change: 1 addition & 0 deletions .Rbuildignore
@@ -9,3 +9,4 @@
^_pkgdown\.yml$
^docs$
^pkgdown$
^cran-comments\.md$
10 changes: 7 additions & 3 deletions DESCRIPTION
@@ -1,11 +1,15 @@
Package: chattr
Title: Integrates LLM's with the RStudio IDE
Version: 0.0.0.9012
Title: Interact with Large Language Models in 'RStudio'
Version: 0.1.0
Authors@R: c(
person("Edgar", "Ruiz", , "[email protected]", role = c("aut", "cre")),
person(given = "Posit Software, PBC", role = c("cph", "fnd"))
)
Description: Integrates LLM's with the RStudio IDE.
Description: Enables user interactivity with large-language models ('LLM') inside
the 'RStudio' integrated development environment ('IDE'). The user can
interact with the model using the 'Shiny' app included in this package, or
directly in the 'R' console. It comes with back-ends for 'OpenAI', 'GitHub'
'Copilot', and 'LlamaGPT'.
URL: https://github.com/mlverse/chattr,
https://mlverse.github.io/chattr/
BugReports: https://github.com/mlverse/chattr/issues
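For context, a minimal sketch of the workflow the new Description text refers to; the back-end label `"gpt4"` and the prompt are illustrative, and already-configured credentials are assumed:

```r
# Minimal sketch of the two interaction styles named in the Description
library(chattr)
chattr_use("gpt4")                            # pick a back-end (label assumed; needs credentials)
chattr_app()                                  # chat through the bundled Shiny app, or...
chattr("Explain what dplyr::across() does")   # ...send a prompt straight from the R console
```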
9 changes: 6 additions & 3 deletions R/app-server.R
@@ -172,9 +172,12 @@ app_add_assistant <- function(content, input) {

prep_entry <- function(x, remove) {
if (remove) {
split_ch <- unlist(strsplit(x, "\n"))
ch <- split_ch[2:(length(split_ch) - 1)]
x <- paste0(ch, collapse = "\n")
x1 <- x[x != ""]
if (length(x1) > 0) {
split_ch <- unlist(strsplit(x1, "\n"))
ch <- split_ch[2:(length(split_ch) - 1)]
x <- paste0(ch, collapse = "\n")
}
}
x
}
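A quick sketch of the behaviour this guard adds; `prep_entry()` is internal, hence the `:::`, and the second call is purely illustrative:

```r
# New guard: an empty entry is returned as-is instead of hitting an invalid index range
chattr:::prep_entry("", remove = TRUE)
#> [1] ""

# Non-empty entries still have their first and last lines stripped, as before
chattr:::prep_entry("header\nbody line\nfooter", remove = TRUE)
#> [1] "body line"
```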
33 changes: 22 additions & 11 deletions R/chattr-app.R
@@ -40,18 +40,29 @@ chattr_app <- function(viewer = c("viewer", "dialog"),
}
} else {
run_file <- tempfile()
writeLines(
c(
"app <- chattr:::app_interactive(as_job = TRUE)\n",
"rp <- list(ui = app$ui, server = app$server)\n",
paste0(
"shiny::runApp(rp, host = '",
as_job_host,
"', port = ",
as_job_port,
")"
)
defaults_file <- path(
tempdir(),
paste0("chat_", paste0(floor(stats::runif(10, 0, 10)), collapse = "")),
ext = "yml"
)
chattr_defaults_save(defaults_file)
app_code <- c(
paste0(
"Sys.setenv(CHATTR_USE = \"", defaults_file, "\")"
),
"print(chattr::chattr_defaults())",
"app <- chattr:::app_interactive(as_job = TRUE)",
"rp <- list(ui = app$ui, server = app$server)",
paste0(
"shiny::runApp(rp, host = '",
as_job_host,
"', port = ",
as_job_port,
")"
)
)
writeLines(
app_code,
con = run_file
)
jobRunScript(path = run_file)
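In effect, the temporary script handed to `jobRunScript()` now looks roughly like the sketch below; the YAML path, host, and port are illustrative placeholders:

```r
# Sketch of the generated run_file contents (all values are placeholders)
Sys.setenv(CHATTR_USE = "/tmp/RtmpXYZ123/chat_4815162342.yml")  # defaults saved via chattr_defaults_save()
print(chattr::chattr_defaults())
app <- chattr:::app_interactive(as_job = TRUE)
rp <- list(ui = app$ui, server = app$server)
shiny::runApp(rp, host = '127.0.0.1', port = 4567)
```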
12 changes: 10 additions & 2 deletions R/chattr-defaults.R
@@ -76,15 +76,23 @@ chattr_defaults <- function(type = "default",
ch_env$defaults <- NULL
}
env_model <- NULL
check_files <- NULL
if (is.null(chattr_defaults_get(type))) {
# Overrides environment variable if YAML file is present
if (file_exists(yaml_file)) {
check_files <- yaml_file
} else {
check_files <- ch_package_file(Sys.getenv("CHATTR_USE", unset = NA))
env_use <- Sys.getenv("CHATTR_USE", unset = NA)
if (!is.na(env_use)) {
if (is_file(env_use)) {
check_files <- path_expand(env_use)
} else {
check_files <- ch_package_file(env_use)
}
}
}
env_model <- Sys.getenv("CHATTR_MODEL", unset = NA)
if(is.na(env_model)) {
if (is.na(env_model)) {
env_model <- NULL
}
for (j in seq_along(check_files)) {
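A minimal usage sketch of the expanded `CHATTR_USE` handling; the YAML path and model value are placeholders, and `"llamagpt"` is the back-end label used in this commit's tests:

```r
# CHATTR_USE may name a bundled back-end...
Sys.setenv(CHATTR_USE = "llamagpt")
# ...or, with this change, point at a YAML defaults file of your own (placeholder path)
Sys.setenv(CHATTR_USE = "~/chattr/my-defaults.yml")
# CHATTR_MODEL optionally overrides the model (placeholder value)
Sys.setenv(CHATTR_MODEL = "~/models/my-model.bin")
chattr::chattr_defaults(force = TRUE)
```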
6 changes: 1 addition & 5 deletions README.md
@@ -11,9 +11,6 @@ status](https://www.r-pkg.org/badges/version/chattr.png)](https://CRAN.R-project
[![](man/figures/lifecycle-experimental.svg)](https://lifecycle.r-lib.org/articles/stages.html#experimental)

<!-- badges: end -->

![](man/figures/readme/chattr.gif)

<!-- toc: start -->

- [Intro](#intro)
@@ -71,8 +68,7 @@ back-ends as time goes by:
</thead>
<tbody>
<tr class="odd">
<td style="text-align: center;"><a
href="https://platform.openai.com/docs/introduction">OpenAI</a></td>
<td style="text-align: center;">OpenAI</td>
<td style="text-align: center;">GPT Models accessible via the OpenAI’s
REST API. <code>chattr</code> provides a convenient way to interact with
GPT 4, and 3.5.</td>
25 changes: 12 additions & 13 deletions README.qmd
@@ -9,8 +9,6 @@ format: md

<!-- badges: end -->

![](man/figures/readme/chattr.gif)

```{r}
#| echo: false
#| eval: false
@@ -32,15 +30,16 @@ url <- urls[grepl(".io", urls)][[1]]
```

<!-- toc: start -->
- [Intro](#intro)
- [Install](#install)
- [Available models](#available-models)
- [Using](#using)
- [The App](#the-app)
- [Additional ways to interact](#additional-ways-to-interact)
- [How it works](#how-it-works)
- [Keyboard Shortcut](#keyboard-shortcut)
- [How to setup the keyboard shortcut](#how-to-setup-the-keyboard-shortcut)

- [Intro](#intro)
- [Install](#install)
- [Available models](#available-models)
- [Using](#using)
- [The App](#the-app)
- [Additional ways to interact](#additional-ways-to-interact)
- [How it works](#how-it-works)
- [Keyboard Shortcut](#keyboard-shortcut)
- [How to setup the keyboard shortcut](#how-to-setup-the-keyboard-shortcut)

<!-- toc: end -->

@@ -63,9 +62,9 @@ remotes::install_github("mlverse/chattr")

`chattr` provides two main integration with two main LLM back-ends. Each back-end provides access to multiple LLM types. The plan is to add more back-ends as time goes by:

| Provider | Models | Setup Instructions |
| Provider | Models | Setup Instructions |
|:-----------------:|:--------------------------------:|:-----------------:|
| [OpenAI](https://platform.openai.com/docs/introduction) | GPT Models accessible via the OpenAI's REST API. `chattr` provides a convenient way to interact with GPT 4, and 3.5. | [Interact with OpenAI GPT models](https://mlverse.github.io/chattr/articles/openai-gpt.html) |
| OpenAI | GPT Models accessible via the OpenAI's REST API. `chattr` provides a convenient way to interact with GPT 4, and 3.5. | [Interact with OpenAI GPT models](https://mlverse.github.io/chattr/articles/openai-gpt.html) |
| [LLamaGPT-Chat](https://github.com/kuvaus/LlamaGPTJ-chat) | LLM models available in your computer. Including GPT-J, LLaMA, and MPT. Tested on a [GPT4ALL](https://gpt4all.io/index.html) model. **LLamaGPT-Chat** is a command line chat program for models written in C++. | [Interact with local models](https://mlverse.github.io/chattr/articles/backend-llamagpt.html) |
| [GitHub Copilot](https://docs.posit.co/ide/user/ide/guide/tools/copilot.html) | AI pair programmer that offers autocomplete-style suggestions as you code | [Interact with GitHub Copilot Chat](https://mlverse.github.io/chattr/articles/copilot-chat.html) |
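For reference, selecting among these back-ends from R might look like the sketch below; the labels are assumed to match `chattr_use()`, and each back-end needs its own setup per the linked articles:

```r
library(chattr)
chattr_use("gpt4")      # OpenAI GPT models
chattr_use("copilot")   # GitHub Copilot Chat
chattr_use("llamagpt")  # local LLamaGPT-Chat model
```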

21 changes: 21 additions & 0 deletions cran-comments.md
@@ -0,0 +1,21 @@
## New submission

This is a new package submission. Enables user interactivity with large-language
models ('LLM') inside the 'RStudio' integrated development environment ('IDE').
The user can interact with the model using the 'Shiny' app included in this
package, or directly in the 'R' console. It comes with back-ends for 'OpenAI',
'GitHub' 'Copilot', and 'LlamaGPT'.

## R CMD check environments

- Mac OS M3 (aarch64-apple-darwin23), R 4.3.3 (Local)

- Mac OS x86_64-apple-darwin20.0 (64-bit), R 4.3.3 (GH Actions)
- Windows x86_64-w64-mingw32 (64-bit), R 4.3.3 (GH Actions)
- Linux x86_64-pc-linux-gnu (64-bit), R 4.3.3 (GH Actions)
- Linux x86_64-pc-linux-gnu (64-bit), R 4.4.0 (dev) (GH Actions)
- Linux x86_64-pc-linux-gnu (64-bit), R 4.2.3 (old release) (GH Actions)

## R CMD check results

0 errors ✔ | 0 warnings ✔ | 0 notes ✔
Binary file removed inst/tests/rstudio-console.rds
Binary file removed inst/tests/rstudio-quarto.rds
Binary file modified inst/tests/rstudio-script.rds
Binary file modified tests/testthat/data/gpt35-error.rds
Binary file modified tests/testthat/data/gpt35-stream.rds
1 change: 1 addition & 0 deletions tests/testthat/test-app-server.R
@@ -23,6 +23,7 @@ test_that("Split content function", {
test_that("Prep-entry works", {
x <- prep_entry(c("a", "b", "c"), TRUE)
expect_true(length(x) == 1)
expect_equal(prep_entry("", TRUE), "")
})

test_that("Cleanup", {
2 changes: 2 additions & 0 deletions tests/testthat/test-backend-openai.R
@@ -86,6 +86,7 @@ test_that("Completion function works for Copilot", {
})

test_that("Copilot token finder works", {
skip_on_cran()
local_mocked_bindings(
req_perform = httr2::req_dry_run,
resp_body_json = function(...) {
@@ -117,6 +118,7 @@ test_that("Copilot token finder works", {
})

test_that("Copilot token folder not found", {
skip_on_cran()
local_mocked_bindings(
os_win = function(...) TRUE
)
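The guard added in these tests follows the standard `testthat` pattern, sketched generically below with a placeholder assertion:

```r
library(testthat)

test_that("network- or mock-dependent behaviour", {
  skip_on_cran()          # body is skipped on CRAN check machines
  expect_equal(1 + 1, 2)  # placeholder assertion
})
```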
3 changes: 2 additions & 1 deletion tests/testthat/test-chattr-use.R
@@ -11,6 +11,7 @@ test_that("Request submission works", {
})

test_that("Menu works", {
skip_on_cran()
withr::with_envvar(
new = c("OPENAI_API_KEY" = "test"),
{
@@ -32,7 +33,7 @@ test_that("Menu works", {
new = c(
"CHATTR_USE" = "llamagpt",
"CHATTR_MODEL" = "test/path"
),
),
expect_snapshot(chattr_defaults(force = TRUE))
)
})
2 changes: 2 additions & 0 deletions tests/testthat/test-chattr.R
@@ -12,6 +12,7 @@ test_that("chattr() works", {
})

test_that("External R submit works", {
skip_on_cran()
local_mocked_bindings(
ui_current = function(...) {
"script"
@@ -22,6 +23,7 @@ test_that("External R submit works", {
})

test_that("Test for null output", {
skip_on_cran()
local_mocked_bindings(
ch_submit = function(...) NULL
)
5 changes: 1 addition & 4 deletions vignettes/openai-gpt.Rmd
@@ -27,8 +27,7 @@ to make the "chatting" experience with the GPT models seamless.
OpenAI requires a **secret key** to authenticate your user. It is required for
any non-OpenAI application, such as `chattr`, to have one in order
to function. A key is a long alphanumeric sequence. The sequence is created in
the OpenAI portal. To obtain your **secret key**, follow this link:
[OpenAI API Keys](https://platform.openai.com/account/api-keys)
the OpenAI portal.

By default, `chattr` will look for the **secret key** inside an Environment
Variable called `OPENAI_API_KEY`. Other packages that integrate with OpenAI use
@@ -79,8 +78,6 @@ To switch back to GPT 4, run:
chattr_use("gpt4")
```

To see the latest list which endpoint to use, go to : [Model Endpoint Compatibility](https://platform.openai.com/docs/models/model-endpoint-compatibility)

## Data files and data frames

Because it is information about your environment and work space, by default
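Relatedly, supplying the **secret key** the vignette text above refers to might look like this sketch; the key value is a placeholder:

```r
# For the current session only (placeholder key value)
Sys.setenv(OPENAI_API_KEY = "sk-xxxxxxxxxxxxxxxx")

# Or persist it across sessions by adding a line like this to ~/.Renviron:
# OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx
```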
