Sys.setenv(ROI_LOAD_PLUGINS = FALSE)
library(ROI)

The following example is based on Automatic Differentiation in R by autodiffr (Changcheng, Nash, and Borchers 2018), which is part of the autodiffr documentation.

First, we load the package and set up the Julia backend.

library(autodiffr)
ad_setup()
## Julia version 1.0.1 at location /home/florian/bin/julia/bin will be used.
## Loading setup script for JuliaCall...
## Finish loading setup script for JuliaCall.
## Loading ReverseDiff...
## Loading ForwardDiff...
## Finish autodiffr setup.

Users who want to use autodiffr with ROI have several options. The easiest and recommended way is to create the gradient (and, if needed, the Hessian) function with autodiffr and provide it when constructing the optimization problem.

fun <- function(x) sum(x^2L)
grad <- makeGradFunc(fun)     # gradient function via automatic differentiation
hess <- makeHessianFunc(fun)  # Hessian function via automatic differentiation

o <- OP(F_objective(F = fun, n = 3L, G = grad, H = hess))
s <- ROI_solve(o, solver = "nlminb", start = rnorm(3))
solution(s)
## [1] 0 0 0

In general, users can change the default differentiation function via ROI_options(), but setting autodiffr as the default is not recommended, since not every function works out of the box.
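
For illustration, a global change would look roughly like the following sketch. The option name "gradient" is an assumption made here for the sake of the example; consult the ROI documentation for the actual option name.

## Illustrative sketch only: the option name "gradient" is an assumption,
## not a verified ROI option name.
ROI_options("gradient", autodiffr::makeGradFunc)

The following function illustrates why such a global default can fail: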

fun0 <- function(x) {
    stopifnot(is.numeric(x), length(x) == 4L)
    det(matrix(x^c(1, 2, 1, 3), 2, 2))
}

x0 <- c(1.2, 1.4, 1.6, 1.8)
fun0(x0)
## [1] 3.8624
tryCatch(ad_grad(fun0, x0), error = function(e) e)
## <simpleError: Error happens in Julia.
## REvalError: >

Changcheng, Nash, and Borchers (2018) resolve this problem by replacing matrix() with array() in the fun0() function.

fun2 <- function(x) {
    stopifnot(is.numeric(x), length(x) == 4L)
    det(array(x^c(1, 2, 1, 3), c(2, 2)))  # array() instead of matrix()
}

x0 <- c(1.2, 1.4, 1.6, 1.8)
fun2(x0)
## [1] 3.8624
ad_grad(fun2, x0)
## [1]  5.832 -4.480 -1.960 11.664
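
Once a function is written in an autodiffr-compatible way, the recommended per-problem workflow from the first example applies again. For instance, a reusable gradient function for fun2() can be built with makeGradFunc() and passed as G to F_objective():

## Sketch: gradient function for the rewritten objective, suitable for
## passing as G to F_objective() as in the first example.
grad2 <- makeGradFunc(fun2)
grad2(x0)  # should match ad_grad(fun2, x0) above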

However, this example shows that a function (e.g., fun0()) which works perfectly fine in base R and with numDeriv

library(numDeriv)
## 
## Attaching package: 'numDeriv'
## The following object is masked _by_ '.GlobalEnv':
## 
##     grad
numDeriv::grad(fun0, x0)
## [1]  5.832 -4.480 -1.960 11.664

can cause errors in autodiffr. Therefore, it is not recommended to set autodiffr as the default option for deriving gradients (via ROI_options()), since this is likely to cause errors which are hard to debug.
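
When autodiffr gradients are used on a per-problem basis, a quick sanity check against numDeriv at the start value can catch such incompatibilities early, using only functions already shown above:

## Sanity check: compare the autodiffr gradient with a numDeriv
## finite-difference gradient before supplying it to F_objective().
all.equal(ad_grad(fun2, x0), numDeriv::grad(fun2, x0))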