diff --git a/.gitignore b/.gitignore index b7691041fd9506aee2547dfb8fad7e75fde77ddf..ccef4c394e77a2173d49ba8ce0b739332a9fef4a 100644 --- a/.gitignore +++ b/.gitignore @@ -17,3 +17,4 @@ test.html lantern/.idea* lantern/cmake-build* check/ +docs/ diff --git a/docs/404.html b/docs/404.html deleted file mode 100644 index c39b2d5eee392e3d9f14cc771df4facbcbb84fc3..0000000000000000000000000000000000000000 --- a/docs/404.html +++ /dev/null @@ -1,191 +0,0 @@ - - - - - - - - -Page not found (404) • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - - -
- -
-
- - -Content not found. Please use links in the navbar. - -
- - - -
- - - - -
- - - - - - - - diff --git a/docs/CONTRIBUTING.html b/docs/CONTRIBUTING.html deleted file mode 100644 index 3533ada4eea54a4b95fe847dc4017a34fa474383..0000000000000000000000000000000000000000 --- a/docs/CONTRIBUTING.html +++ /dev/null @@ -1,228 +0,0 @@ - - - - - - - - -Contributing to torch • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - - -
- -
-
- - -
- -

This outlines how to propose a change to torch. For more detailed info about contributing to this and other tidyverse packages, please see the development contributing guide.

-
-

-Fixing typos

-

You can fix typos, spelling mistakes, or grammatical errors in the documentation directly using the GitHub web interface, as long as the changes are made in the source file. This generally means you’ll need to edit roxygen2 comments in an .R file, not in a .Rd file. You can find the .R file that generates the .Rd by reading the comment in its first line.

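For instance, the documentation for a function lives in the roxygen2 block right above its definition. A minimal, hypothetical sketch (my_fun is not a real torch function):

#' Add two tensors
#'
#' @param x a torch tensor.
#' @param y a torch tensor to be added to x.
#' @export
my_fun <- function(x, y) {
  x + y
}

Editing the text in this block (and re-documenting) is what updates the rendered help page.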
-

See also the Documentation section below.

-
-
-

-Filing bugs

-

If you find a bug in torch, please open an issue here, providing detailed information on how to reproduce it. It would be great to also provide a reprex.

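A reprex is a small, self-contained script that anyone can run to see the problem; a sketch along these lines (the bug shown is hypothetical):

library(torch)

x <- torch_tensor(c(1, 2, 3))
# state what you expected here, and what happened instead
x$sum()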
-
-
-

-Feature requests

-

Feel free to open issues here and add the feature-request tag. First search whether there is already an open issue for your feature request; in that case it is better to comment on or upvote it instead of opening a new one.

-
-
-

-Examples

-

We welcome contributed examples. Feel free to open a PR with new examples; they should be placed in the vignettes/examples folder.

-

Each example should consist of an .R file and an .Rmd file with the same name that just renders the code; a sketch of the wrapper follows.

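Assuming the script is named my-example.R (a hypothetical name), the .Rmd wrapper might look like this; the chunk options used by the existing examples may differ:

---
title: "my-example"
---

```{r, eval = FALSE, code = readLines("my-example.R")}
```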
-

See mnist-mlp.R and mnist-mlp.Rmd for a real example of such a pair.

-

One must be able to run the example without manually downloading any dataset/file. You should also add an entry to the _pkgdown.yaml file.

-
-
-

-Code contributions

-

We have many open issues in the GitHub repo. If there is an item you want to work on, you can comment on it and ask for directions.

-
-
-

-Documentation

-

We use roxygen2 to generate the documentation. In order to update the docs, edit the corresponding file in the R directory. To regenerate and preview the docs, use the custom tools/document.R script, as we need to patch roxygen2 to avoid running the examples on CRAN.

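In practice, that means sourcing the script from the package root instead of calling devtools::document() directly; a sketch (check the script itself for the exact workflow):

# from the package root:
source("tools/document.R")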
-
-
- -
- - - -
- - - - -
- - - - - - - - diff --git a/docs/LICENSE-text.html b/docs/LICENSE-text.html deleted file mode 100644 index 3ec2627dd8e439d207944d074811ad7ebe50381f..0000000000000000000000000000000000000000 --- a/docs/LICENSE-text.html +++ /dev/null @@ -1,193 +0,0 @@ - - - - - - - - -License • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - - -
- -
-
- - -
YEAR: 2020
-COPYRIGHT HOLDER: Daniel Falbel
-
- -
- - - -
- - - - -
- - - - - - - - diff --git a/docs/LICENSE.html b/docs/LICENSE.html deleted file mode 100644 index 7b6b7b0fc4cce273f50348afa06acc656de0f9ff..0000000000000000000000000000000000000000 --- a/docs/LICENSE.html +++ /dev/null @@ -1,197 +0,0 @@ - - - - - - - - -MIT License • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - - -
- -
-
- - -
- -

Copyright (c) 2020 Daniel Falbel

-

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

-

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

-

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

-
- -
- - - -
- - - - -
- - - - - - - - diff --git a/docs/articles/examples/mnist-cnn.html b/docs/articles/examples/mnist-cnn.html deleted file mode 100644 index bc24e1af7c04b0f06b119cb6e56899935ddba460..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-cnn.html +++ /dev/null @@ -1,215 +0,0 @@ - - - - - - - -mnist-cnn • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
dir <- "~/Downloads/mnist"
-
-ds <- mnist_dataset(
-  dir,
-  download = TRUE,
-  transform = function(x) {
-    x <- x$to(dtype = torch_float())/256
-    x[newaxis,..]
-  }
-)
-dl <- dataloader(ds, batch_size = 32, shuffle = TRUE)
-
-net <- nn_module(
-  "Net",
-  initialize = function() {
-    self$conv1 <- nn_conv2d(1, 32, 3, 1)
-    self$conv2 <- nn_conv2d(32, 64, 3, 1)
-    self$dropout1 <- nn_dropout2d(0.25)
-    self$dropout2 <- nn_dropout2d(0.5)
-    self$fc1 <- nn_linear(9216, 128)
-    self$fc2 <- nn_linear(128, 10)
-  },
-  forward = function(x) {
-    x <- self$conv1(x)
-    x <- nnf_relu(x)
-    x <- self$conv2(x)
-    x <- nnf_relu(x)
-    x <- nnf_max_pool2d(x, 2)
-    x <- self$dropout1(x)
-    x <- torch_flatten(x, start_dim = 2)
-    x <- self$fc1(x)
-    x <- nnf_relu(x)
-    x <- self$dropout2(x)
-    x <- self$fc2(x)
-    output <- nnf_log_softmax(x, dim=1)
-    output
-  }
-)
-
-model <- net()
-optimizer <- optim_sgd(model$parameters, lr = 0.01)
-
-epochs <- 10
-
-for (epoch in 1:epochs) {
-
-  pb <- progress::progress_bar$new(
-    total = length(dl),
-    format = "[:bar] :eta Loss: :loss"
-  )
-  l <- c()
-
-  for (b in enumerate(dl)) {
-    optimizer$zero_grad()
-    output <- model(b[[1]])
-    loss <- nnf_nll_loss(output, b[[2]])
-    loss$backward()
-    optimizer$step()
-    l <- c(l, loss$item())
-    pb$tick(tokens = list(loss = mean(l)))
-  }
-
-  cat(sprintf("Loss at epoch %d: %3f\n", epoch, mean(l)))
-}
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/examples/mnist-cnn_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/examples/mnist-cnn_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-cnn_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/examples/mnist-cnn_files/header-attrs-2.1.1/header-attrs.js b/docs/articles/examples/mnist-cnn_files/header-attrs-2.1.1/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-cnn_files/header-attrs-2.1.1/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/examples/mnist-cnn_files/header-attrs-2.3/header-attrs.js b/docs/articles/examples/mnist-cnn_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-cnn_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/examples/mnist-dcgan.html b/docs/articles/examples/mnist-dcgan.html deleted file mode 100644 index ca8f65769502c81540e85024836eada8ce46d013..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-dcgan.html +++ /dev/null @@ -1,296 +0,0 @@ - - - - - - - -mnist-dcgan • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
library(torch)
-
-dir <- "~/Downloads/mnist"
-
-ds <- mnist_dataset(
-  dir,
-  download = TRUE,
-  transform = function(x) {
-    x <- x$to(dtype = torch_float())/256
-    x <- 2*(x - 0.5)
-    x[newaxis,..]
-  }
-)
-dl <- dataloader(ds, batch_size = 32, shuffle = TRUE)
-
-generator <- nn_module(
-  "generator",
-  initialize = function(latent_dim, out_channels) {
-    self$main <- nn_sequential(
-      nn_conv_transpose2d(latent_dim, 512, kernel_size = 4,
-                          stride = 1, padding = 0, bias = FALSE),
-      nn_batch_norm2d(512),
-      nn_relu(),
-      nn_conv_transpose2d(512, 256, kernel_size = 4,
-                          stride = 2, padding = 1, bias = FALSE),
-      nn_batch_norm2d(256),
-      nn_relu(),
-      nn_conv_transpose2d(256, 128, kernel_size = 4,
-                          stride = 2, padding = 1, bias = FALSE),
-      nn_batch_norm2d(128),
-      nn_relu(),
-      nn_conv_transpose2d(128, out_channels, kernel_size = 4,
-                          stride = 2, padding = 3, bias = FALSE),
-      nn_tanh()
-    )
-  },
-  forward = function(input) {
-    self$main(input)
-  }
-)
-
-discriminator <- nn_module(
-  "discriminator",
-  initialize = function(in_channels) {
-    self$main <- nn_sequential(
-      nn_conv2d(in_channels, 16, kernel_size = 4, stride = 2, padding = 1, bias = FALSE),
-      nn_leaky_relu(0.2, inplace = TRUE),
-      nn_conv2d(16, 32, kernel_size = 4, stride = 2, padding = 1, bias = FALSE),
-      nn_batch_norm2d(32),
-      nn_leaky_relu(0.2, inplace = TRUE),
-      nn_conv2d(32, 64, kernel_size = 4, stride = 2, padding = 1, bias = FALSE),
-      nn_batch_norm2d(64),
-      nn_leaky_relu(0.2, inplace = TRUE),
-      nn_conv2d(64, 128, kernel_size = 4, stride = 2, padding = 1, bias = FALSE),
-      nn_leaky_relu(0.2, inplace = TRUE)
-    )
-    self$linear <- nn_linear(128, 1)
-    self$sigmoid <- nn_sigmoid()
-  },
-  forward = function(input) {
-    x <- self$main(input)
-    x <- torch_flatten(x, start_dim = 2)
-    x <- self$linear(x)
-    self$sigmoid(x)
-  }
-)
-
-plot_gen <- function(noise) {
-  img <- G(noise)
-  img <- img$cpu()
-  img <- img[1,1,,,newaxis]/2 + 0.5
-  img <- torch_stack(list(img, img, img), dim = 2)[..,1]
-  img <- as.raster(as_array(img))
-  plot(img)
-}
-
-device <- torch_device(ifelse(cuda_is_available(),  "cuda", "cpu"))
-
-G <- generator(latent_dim = 100, out_channels = 1)
-D <- discriminator(in_channels = 1)
-
-init_weights <- function(m) {
-  if (grepl("conv", m$.classes[[1]])) {
-    nn_init_normal_(m$weight$data(), 0.0, 0.02)
-  } else if (grepl("batch_norm", m$.classes[[1]])) {
-    nn_init_normal_(m$weight$data(), 1.0, 0.02)
-    nn_init_constant_(m$bias$data(), 0)
-  }
-}
-
-G[[1]]$apply(init_weights)
-D[[1]]$apply(init_weights)
-
-G$to(device = device)
-D$to(device = device)
-
-G_optimizer <- optim_adam(G$parameters, lr = 2 * 1e-4, betas = c(0.5, 0.999))
-D_optimizer <- optim_adam(D$parameters, lr = 2 * 1e-4, betas = c(0.5, 0.999))
-
-fixed_noise <- torch_randn(1, 100, 1, 1, device = device)
-
-loss <- nn_bce_loss()
-
-for (epoch in 1:10) {
-
-  pb <- progress::progress_bar$new(
-    total = length(dl),
-    format = "[:bar] :eta Loss D: :lossd Loss G: :lossg"
-  )
-  lossg <- c()
-  lossd <- c()
-
-  for (b in enumerate(dl)) {
-
-    y_real <- torch_ones(32, device = device)
-    y_fake <- torch_zeros(32, device = device)
-
-    noise <- torch_randn(32, 100, 1, 1, device = device)
-    fake <- G(noise)
-
-    img <- b[[1]]$to(device = device)
-
-    # train the discriminator ---
-    D_loss <- loss(D(img), y_real) + loss(D(fake$detach()), y_fake)
-
-    D_optimizer$zero_grad()
-    D_loss$backward()
-    D_optimizer$step()
-
-    # train the generator ---
-
-    G_loss <- loss(D(fake), y_real)
-
-    G_optimizer$zero_grad()
-    G_loss$backward()
-    G_optimizer$step()
-
-    lossd <- c(lossd, D_loss$item())
-    lossg <- c(lossg, G_loss$item())
-    pb$tick(tokens = list(lossd = mean(lossd), lossg = mean(lossg)))
-  }
-  plot_gen(fixed_noise)
-
-  cat(sprintf("Epoch %d - Loss D: %3f Loss G: %3f\n", epoch, mean(lossd), mean(lossg)))
-}
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/examples/mnist-dcgan_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/examples/mnist-dcgan_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-dcgan_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/examples/mnist-dcgan_files/header-attrs-2.3/header-attrs.js b/docs/articles/examples/mnist-dcgan_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-dcgan_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/examples/mnist-mlp.html b/docs/articles/examples/mnist-mlp.html deleted file mode 100644 index db40b8ea67a827c698d268c466c47076dab8c7b5..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-mlp.html +++ /dev/null @@ -1,203 +0,0 @@ - - - - - - - -mnist-mlp • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
dir <- "~/Downloads/mnist"
-
-ds <- mnist_dataset(
-  dir,
-  download = TRUE,
-  transform = function(x) {
-    x$to(dtype = torch_float())/256
-  }
-)
-dl <- dataloader(ds, batch_size = 32, shuffle = TRUE)
-
-net <- nn_module(
-  "Net",
-  initialize = function() {
-    self$fc1 <- nn_linear(784, 128)
-    self$fc2 <- nn_linear(128, 10)
-  },
-  forward = function(x) {
-    x %>%
-      torch_flatten(start_dim = 2) %>%
-      self$fc1() %>%
-      nnf_relu() %>%
-      self$fc2() %>%
-      nnf_log_softmax(dim = 1)
-  }
-)
-
-model <- net()
-optimizer <- optim_sgd(model$parameters, lr = 0.01)
-
-epochs <- 10
-
-for (epoch in 1:epochs) {
-
-  pb <- progress::progress_bar$new(
-    total = length(dl),
-    format = "[:bar] :eta Loss: :loss"
-  )
-  l <- c()
-
-  for (b in enumerate(dl)) {
-    optimizer$zero_grad()
-    output <- model(b[[1]])
-    loss <- nnf_nll_loss(output, b[[2]])
-    loss$backward()
-    optimizer$step()
-    l <- c(l, loss$item())
-    pb$tick(tokens = list(loss = mean(l)))
-  }
-
-  cat(sprintf("Loss at epoch %d: %3f\n", epoch, mean(l)))
-}
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/examples/mnist-mlp_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/examples/mnist-mlp_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-mlp_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/examples/mnist-mlp_files/header-attrs-2.1.1/header-attrs.js b/docs/articles/examples/mnist-mlp_files/header-attrs-2.1.1/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-mlp_files/header-attrs-2.1.1/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/examples/mnist-mlp_files/header-attrs-2.3/header-attrs.js b/docs/articles/examples/mnist-mlp_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/examples/mnist-mlp_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/extending-autograd.html b/docs/articles/extending-autograd.html deleted file mode 100644 index 8c27bfa2e021ff15406cac0fe22ed6f3dce868d4..0000000000000000000000000000000000000000 --- a/docs/articles/extending-autograd.html +++ /dev/null @@ -1,222 +0,0 @@ - - - - - - - -Extending Autograd • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
library(torch)
-

Adding operations to autograd requires implementing a new autograd_function for each operation. Recall that autograd_functions are what autograd uses to compute the results and gradients, and to encode the operation history. Every new function requires you to implement two methods:

-
    -
  • forward() - the code that performs the operation. It can take as many arguments as you want, some of them optional if you specify default values. All kinds of R objects are accepted here. Tensor arguments that track history (i.e., with requires_grad = TRUE) will be converted to ones that don’t track history before the call, and their use will be registered in the graph. Note that this logic won’t traverse lists or any other data structures and will only consider Tensors that are direct arguments to the call. You can return either a single Tensor output, or a list of Tensors if there are multiple outputs. Also, please refer to the docs of autograd_function to find descriptions of useful methods that can be called only from forward().

  • -
  • backward() - gradient formula. It will be given as many Tensor arguments as there were outputs, each of them representing the gradient w.r.t. that output. It should return as many Tensors as there were Tensors that required gradients in forward(), each of them containing the gradient w.r.t. its corresponding input.

  • -
-
-

-Note

-

It’s the user’s responsibility to use the special functions in the forward’s ctx properly in order to ensure that the new autograd_function works properly with the autograd engine.

-
    -
  • save_for_backward() must be used when saving input or output of the forward to be used later in the backward.

  • -
  • mark_dirty() must be used to mark any input that is modified in place by the forward function.

  • -
  • mark_non_differentiable() must be used to tell the engine if an output is not differentiable.

  • -
-
-
-

-Examples

-

Below you can find code for a linear function:

-
linear <- autograd_function(
-  forward = function(ctx, input, weight, bias = NULL) {
-    ctx$save_for_backward(input = input, weight = weight, bias = bias)
-    output <- input$mm(weight$t())
-    if (!is.null(bias))
-      output <- output + bias$unsqueeze(1)$expand_as(output)
-
-    output
-  },
-  backward = function(ctx, grad_output) {
-
-    s <- ctx$saved_variables
-
-    grads <- list(
-      input = NULL,
-      weight = NULL,
-      bias = NULL
-    )
-
-    if (ctx$needs_input_grad$input)
-      grads$input <- grad_output$mm(s$weight)
-
-    if (ctx$needs_input_grad$weight)
-      grads$weight <- grad_output$t()$mm(s$input)
-
-    if (!is.null(s$bias) && ctx$needs_input_grad$bias)
-      grads$bias <- grad_output$sum(dim = 1)
-
-    grads
-  }
-)
-

Here, we give an additional example of a function that is parametrized by non-Tensor arguments:

-
mul_constant <- autograd_function(
-  forward = function(ctx, tensor, constant) {
-    ctx$save_for_backward(constant = constant)
-    tensor * constant
-  },
-  backward = function(ctx, grad_output) {
-    v <- ctx$saved_variables
-    list(
-      tensor = grad_output * v$constant
-    )
-  }
-)
-
x <- torch_tensor(1, requires_grad = TRUE)
-o <- mul_constant(x, 2)
-o$backward()
-x$grad
-
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/extending-autograd_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/extending-autograd_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/extending-autograd_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/extending-autograd_files/header-attrs-2.1.1/header-attrs.js b/docs/articles/extending-autograd_files/header-attrs-2.1.1/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/extending-autograd_files/header-attrs-2.1.1/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/extending-autograd_files/header-attrs-2.3/header-attrs.js b/docs/articles/extending-autograd_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/extending-autograd_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/index.html b/docs/articles/index.html deleted file mode 100644 index 50afc9ac2fd9bfcf423c06aab30da3ce6a49ca59..0000000000000000000000000000000000000000 --- a/docs/articles/index.html +++ /dev/null @@ -1,204 +0,0 @@ - - - - - - - - -Articles • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - - -
- -
-
- - - -
-
- - - -
- - - - - - - - diff --git a/docs/articles/indexing.html b/docs/articles/indexing.html deleted file mode 100644 index 0207c0554ad6c5f58fd2b3d8f823b693921165d4..0000000000000000000000000000000000000000 --- a/docs/articles/indexing.html +++ /dev/null @@ -1,224 +0,0 @@ - - - - - - - -Indexing tensors • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
library(torch)
-

In this article we describe the indexing operator for torch tensors and how it compares to the R indexing operator for arrays.

-

Torch’s indexing semantics are closer to numpy’s semantics than R’s. You will find a lot of similarities between this article and the numpy indexing article available here.

-
-

-Single element indexing

-

Single element indexing for 1-D tensors works mostly as expected. Like R, it is 1-based. Unlike R, though, it accepts negative indices for indexing from the end of the array. (In R, negative indices are used to remove elements.)

-
x <- torch_tensor(1:10)
-x[1]
-x[-1]
-

You can also subset matrices and higher dimensions arrays using the same syntax:

-
x <- x$reshape(shape = c(2,5))
-x
-x[1,3]
-x[1,-1]
-

Note that if one indexes a multidimensional tensor with fewer indices than dimensions, one gets an error, unlike in R, which would flatten the array. For example:

-
x[1]
-
-
-

-Slicing and striding

-

It is possible to slice and stride arrays to extract sub-arrays of the same number of dimensions, but of different sizes than the original. This is best illustrated by a few examples:

-
x <- torch_tensor(1:10)
-x
-x[2:5]
-x[1:(-7)]
-

You can also use the 1:10:2 syntax which means: In the range from 1 to 10, take every second item. For example:

-
x[1:5:2]
-

Another special syntax is N, meaning the size of the specified dimension.

-
x[5:N]
-
-
-

-Getting the complete dimension

-

Like in R, you can take all elements in a dimension by leaving an index empty.

-

Consider a matrix:

-
x <- torch_randn(2, 3)
-x
-

The following syntax will give you the first row:

-
x[1,]
-

And this would give you the first 2 columns:

-
x[,1:2]
-
-
-

-Dropping dimensions

-

By default, when indexing by a single integer, this dimension will be dropped to avoid the singleton dimension:

-
x <- torch_randn(2, 3)
-x[1,]$shape
-

You can optionally use the drop = FALSE argument to avoid dropping the dimension.

-
x[1,,drop = FALSE]$shape
-
-
-

-Adding a new dimension

-

It’s possible to add a new dimension to a tensor using index-like syntax:

-
x <- torch_tensor(c(10))
-x$shape
-x[, newaxis]$shape
-x[, newaxis, newaxis]$shape
-

You can also use NULL instead of newaxis:

-
x[,NULL]$shape
-
-
-

-Dealing with variable number of indices

-

Sometimes we don’t know how many dimensions a tensor has, but we do know what to do with the last available dimension, or the first one. To subsume all others, we can use ..:

-
z <- torch_tensor(1:125)$reshape(c(5,5,5))
-z[1,..]
-z[..,1]
-
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/indexing_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/indexing_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/indexing_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/indexing_files/header-attrs-2.1.1/header-attrs.js b/docs/articles/indexing_files/header-attrs-2.1.1/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/indexing_files/header-attrs-2.1.1/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/indexing_files/header-attrs-2.3/header-attrs.js b/docs/articles/indexing_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/indexing_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/loading-data.html b/docs/articles/loading-data.html deleted file mode 100644 index 3fd2a41a3fbf466887de1aa02e43612b7e8cb2e3..0000000000000000000000000000000000000000 --- a/docs/articles/loading-data.html +++ /dev/null @@ -1,274 +0,0 @@ - - - - - - - -Loading data • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
library(torch)
-
-

-Datasets and data loaders

-

Central to data ingestion and preprocessing are datasets and data loaders.

-

torch comes equipped with a collection of datasets related, mostly, to image recognition and natural language processing (e.g., mnist_dataset()), which can be iterated over by means of dataloaders:

-
# ...
-ds <- mnist_dataset(
-  dir, 
-  download = TRUE, 
-  transform = function(x) {
-    x <- x$to(dtype = torch_float())/256
-    x[newaxis,..]
-  }
-)
-
-dl <- dataloader(ds, batch_size = 32, shuffle = TRUE)
-
-for (b in enumerate(dl)) {
-  # ...
-

Cf. vignettes/examples/mnist-cnn.R for a complete example.

-

What if you want to train on a different dataset? In these cases, you subclass dataset, an abstract container that needs to know how to iterate over the given data. To that end, your subclass needs to implement .getitem() and say what should be returned when the data loader asks for the next batch.

-

In .getitem(), you can implement whatever preprocessing you require. Additionally, you should implement .length(), so users can find out how many items there are in the dataset.

-

While this may sound complicated, it is not at all. The base logic is straightforward – complexity will, naturally, correlate with how involved your preprocessing is. To provide you with a simple but functional prototype, here we show how to create your own dataset to train on Allison Horst's penguins.

-
-
-

-A custom dataset

-
library(palmerpenguins)
-library(magrittr)
-
-penguins
-

Datasets are R6 classes created using the dataset() constructor. You can pass a name and various member functions. Among those should be initialize(), to create instance variables, .getitem(), to indicate how the data should be returned, and .length(), to say how many items we have.

-

In addition, any number of helper functions can be defined.

-

Here, we assume the penguins have already been loaded, and all preprocessing consists of removing rows with NA values, transforming factors to numbers, and converting from R data types to torch tensors.

-

In .getitem, we essentially decide how this data is going to be used: All variables besides species go into x, the predictor, and species will constitute y, the target. Predictor and target are returned in a list, to be accessed as batch[[1]] and batch[[2]] during training.

-
penguins_dataset <- dataset(
-
-  name = "penguins_dataset",
-
-  initialize = function() {
-    self$data <- self$prepare_penguin_data()
-  },
-
-  .getitem = function(index) {
-
-    x <- self$data[index, 2:-1]
-    y <- self$data[index, 1]$to(torch_long())
-
-    list(x, y)
-  },
-
-  .length = function() {
-    self$data$size()[[1]]
-  },
-
-  prepare_penguin_data = function() {
-
-    input <- na.omit(penguins)
-    # conveniently, the categorical data are already factors
-    input$species <- as.numeric(input$species)
-    input$island <- as.numeric(input$island)
-    input$sex <- as.numeric(input$sex)
-
-    input <- as.matrix(input)
-    torch_tensor(input)
-  }
-)
-

Let’s create the dataset, query its length, and look at its first item:

-
tuxes <- penguins_dataset()
-tuxes$.length()
-tuxes$.getitem(1)
-

To be able to iterate over tuxes, we need a data loader (we override the default batch size of 1):

-
dl <- tuxes %>% dataloader(batch_size = 8)
-

Calling .length() on a data loader (as opposed to a dataset) will return the number of batches we have:

-
dl$.length()
-

And we can create an iterator to inspect the first batch:

-
iter <- dl$.iter()
-b <- iter$.next()
-b
-

To train a network, we can use enumerate to iterate over batches.

-
-
-

-Training with data loaders

-

Our example network is very simple. (In reality, we would want to treat island as the categorical variable it is, and either one-hot-encode or embed it.)

-
net <- nn_module(
-  "PenguinNet",
-  initialize = function() {
-    self$fc1 <- nn_linear(6, 32)
-    self$fc2 <- nn_linear(32, 3)
-  },
-  forward = function(x) {
-    x %>%
-      self$fc1() %>%
-      nnf_relu() %>%
-      self$fc2() %>%
-      nnf_log_softmax(dim = 1)
-  }
-)
-
-model <- net()
-

We still need an optimizer:

-
optimizer <- optim_sgd(model$parameters, lr = 0.01)
-

And we’re ready to train:

-
for (epoch in 1:10) {
-
-  l <- c()
-
-  for (b in enumerate(dl)) {
-    optimizer$zero_grad()
-    output <- model(b[[1]])
-    loss <- nnf_nll_loss(output, b[[2]])
-    loss$backward()
-    optimizer$step()
-    l <- c(l, loss$item())
-  }
-
-  cat(sprintf("Loss at epoch %d: %3f\n", epoch, mean(l)))
-}
-
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/loading-data_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/loading-data_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/loading-data_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/loading-data_files/header-attrs-2.3/header-attrs.js b/docs/articles/loading-data_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/loading-data_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/tensor-creation.html b/docs/articles/tensor-creation.html deleted file mode 100644 index 96bc72ff20e6eee597bea4ad3f71d32889586277..0000000000000000000000000000000000000000 --- a/docs/articles/tensor-creation.html +++ /dev/null @@ -1,231 +0,0 @@ - - - - - - - -Creating tensors • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
library(torch)
-

In this article we describe various ways of creating torch tensors in R.

-
-

-From R objects

-

You can create tensors from R objects using the torch_tensor function. The torch_tensor function takes an R vector, matrix or array and creates an equivalent torch_tensor.

-

You can see a few examples below:

-
torch_tensor(c(1,2,3))
-
-# conform to row-major indexing used in torch
-torch_tensor(matrix(1:10, ncol = 5, nrow = 2, byrow = TRUE))
-torch_tensor(array(runif(12), dim = c(2, 2, 3)))
-

By default, we create tensors on the CPU device, converting their R data type to the corresponding torch dtype.

-
-

Note that, currently, only numeric and boolean types are supported.

-
-

You can always modify dtype and device when converting an R object to a torch tensor. For example:

-
torch_tensor(1, dtype = torch_long())
-torch_tensor(1, device = "cpu", dtype = torch_float64())
-

Other options available when creating a tensor are:

-
    -
  • requires_grad: boolean indicating whether you want autograd to record operations on the tensor for automatic differentiation.
  • -
  • pin_memory: if set, the returned tensor will be allocated in pinned memory. Works only for CPU tensors.
  • -
-

These options are available for all functions that can be used to create new tensors, including the factory functions listed in the next section.

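For example, a minimal sketch:

x <- torch_tensor(c(1, 2, 3), requires_grad = TRUE)
x$requires_grad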
-
-
-

-Using creation functions

-

You can also use the torch_* functions listed below to create torch tensors using some algorithm.

-

For example, the torch_randn function will create tensors using the normal distribution with mean 0 and standard deviation 1. You can use the ... argument to pass the size of the dimensions. For example, the code below will create a normally distributed tensor with shape 5x3.

-
x <- torch_randn(5, 3)
-x
-

Another example is torch_ones, which creates a tensor filled with ones.

-
x <- torch_ones(2, 4, dtype = torch_int64(), device = "cpu")
-x
-

Here is the full list of functions that can be used to bulk-create tensors in torch; a few quick examples follow the list:

-
    -
  • -torch_arange: Returns a tensor with a sequence of integers,
  • -
  • -torch_empty: Returns a tensor with uninitialized values,
  • -
  • -torch_eye: Returns an identity matrix,
  • -
  • -torch_full: Returns a tensor filled with a single value,
  • -
  • -torch_linspace: Returns a tensor with values linearly spaced in some interval,
  • -
  • -torch_logspace: Returns a tensor with values logarithmically spaced in some interval,
  • -
  • -torch_ones: Returns a tensor filled with all ones,
  • -
  • -torch_rand: Returns a tensor filled with values drawn from a uniform distribution on [0, 1).
  • -
  • -torch_randint: Returns a tensor with integers randomly drawn from an interval,
  • -
  • -torch_randn: Returns a tensor filled with values drawn from a unit normal distribution,
  • -
  • -torch_randperm: Returns a tensor filled with a random permutation of integers in some interval,
  • -
  • -torch_zeros: Returns a tensor filled with all zeros.
  • -
-
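A few quick examples of these factory functions (a sketch; see each function’s documentation for the full set of arguments):

torch_eye(3)              # 3x3 identity matrix
torch_full(c(2, 2), 7)    # 2x2 tensor filled with 7
torch_zeros(2, 3)         # 2x3 tensor of zeros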
-
-

-Conversion

-

Once a tensor exists, you can convert between dtypes and move it to a different device with the $to() method. For example:

-
x <- torch_tensor(1)
-y <- x$to(dtype = torch_int32())
-x
-y
-

You can also copy a tensor to the GPU using:

-
x <- torch_tensor(1)
-y <- x$cuda()
-
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/tensor-creation_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/tensor-creation_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/tensor-creation_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/tensor-creation_files/header-attrs-2.1.1/header-attrs.js b/docs/articles/tensor-creation_files/header-attrs-2.1.1/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/tensor-creation_files/header-attrs-2.1.1/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/tensor-creation_files/header-attrs-2.3/header-attrs.js b/docs/articles/tensor-creation_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/tensor-creation_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/using-autograd.html b/docs/articles/using-autograd.html deleted file mode 100644 index 9cd634b0cef30e19ac25fa32a466f70c17818a5a..0000000000000000000000000000000000000000 --- a/docs/articles/using-autograd.html +++ /dev/null @@ -1,268 +0,0 @@ - - - - - - - -Using autograd • torch - - - - - - - - - - -
-
- - - - -
-
- - - - -
library(torch)
-

So far, all we’ve been using from torch is tensors, and we’ve been performing all calculations ourselves: computing the predictions, the loss, the gradients (and thus, the necessary updates to the weights), and the new weight values. In this chapter, we’ll make a significant change: namely, we spare ourselves the cumbersome calculation of gradients, and have torch do it for us.

-

Before we see that in action, let’s get some more background.

-
-

-Automatic differentiation with autograd

-

Torch uses a module called autograd to record operations performed on tensors, and store what has to be done to obtain the respective gradients. These actions are stored as functions, and those functions are applied in order when the gradient of the output (normally, the loss) with respect to those tensors is calculated: starting from the output node and propagating gradients back through the network. This is a form of reverse mode automatic differentiation.

-

As users, we can see a bit of this implementation. As a prerequisite for this “recording” to happen, tensors have to be created with requires_grad = TRUE. For example:

-
x <- torch_ones(2,2, requires_grad = TRUE)
-

To be clear, this is a tensor with respect to which gradients have to be calculated – normally, a tensor representing a weight or a bias, not the input data [1]. If we now perform some operation on that tensor, assigning the result to y

-
y <- x$mean()
-

we find that y now has a non-empty grad_fn that tells torch how to compute the gradient of y with respect to x:

-
y$grad_fn
-

Actual computation of gradients is triggered by calling backward() on the output tensor.

-
y$backward()
-

That executed, x now has a non-empty field grad that stores the gradient of y with respect to x:

-
x$grad
-

With a longer chain of computations, we can peek at how torch builds up a graph of backward operations.

-

Here is a slightly more complex example. We call retain_grad() on y and z just for demonstration purposes; by default, intermediate gradients – while of course they have to be computed – aren’t stored, in order to save memory.

-
x1 <- torch_ones(2,2, requires_grad = TRUE)
-x2 <- torch_tensor(1.1, requires_grad = TRUE)
-y <- x1 * (x2 + 2)
-y$retain_grad()
-z <- y$pow(2) * 3
-z$retain_grad()
-out <- z$mean()
-

Starting from out$grad_fn, we can follow the graph all the way back to the leaf nodes:

-
# how to compute the gradient for mean, the last operation executed
-out$grad_fn
-# how to compute the gradient for the multiplication by 3 in z = y$pow(2) * 3
-out$grad_fn$next_functions
-# how to compute the gradient for pow in z = y$pow(2) * 3
-out$grad_fn$next_functions[[1]]$next_functions
-# how to compute the gradient for the multiplication in y = x1 * (x2 + 2)
-out$grad_fn$next_functions[[1]]$next_functions[[1]]$next_functions
-# how to compute the gradient for the two branches of y = x1 * (x2 + 2),
-# where the left branch is a leaf node (AccumulateGrad for x1)
-out$grad_fn$next_functions[[1]]$next_functions[[1]]$next_functions[[1]]$next_functions
-# here we arrive at the other leaf node (AccumulateGrad for x2)
-out$grad_fn$next_functions[[1]]$next_functions[[1]]$next_functions[[1]]$next_functions[[2]]$next_functions
-

After calling out$backward(), all tensors in the graph will have their respective gradients created. Without our calls to retain_grad above, z$grad and y$grad would be empty:

-
out$backward()
-z$grad
-y$grad
-x2$grad
-x1$grad
-

Thus acquainted with autograd, we’re ready to modify our example.

-
-
-

-The simple network, now using autograd

-

In exchange for a single new line calling loss$backward(), a number of lines (that did manual backprop) are now gone:

-
### generate training data -----------------------------------------------------
-# input dimensionality (number of input features)
-d_in <- 3
-# output dimensionality (number of predicted features)
-d_out <- 1
-# number of observations in training set
-n <- 100
-# create random data
-x <- torch_randn(n, d_in)
-y <- x[, 1]*0.2 - x[, 2]*1.3 - x[, 3]*0.5 + torch_randn(n)
-y <- y$unsqueeze(dim = 1)
-### initialize weights ---------------------------------------------------------
-# dimensionality of hidden layer
-d_hidden <- 32
-# weights connecting input to hidden layer
-w1 <- torch_randn(d_in, d_hidden, requires_grad = TRUE)
-# weights connecting hidden to output layer
-w2 <- torch_randn(d_hidden, d_out, requires_grad = TRUE)
-# hidden layer bias
-b1 <- torch_zeros(1, d_hidden, requires_grad = TRUE)
-# output layer bias
-b2 <- torch_zeros(1, d_out,requires_grad = TRUE)
-### network parameters ---------------------------------------------------------
-learning_rate <- 1e-4
-### training loop --------------------------------------------------------------
-for (t in 1:200) {
-
-    ### -------- Forward pass -------- 
-    y_pred <- x$mm(w1)$add(b1)$clamp(min = 0)$mm(w2)$add(b2)
-    ### -------- compute loss -------- 
-    loss <- (y_pred - y)$pow(2)$mean()
-    if (t %% 10 == 0) cat(t, as_array(loss), "\n")
-    ### -------- Backpropagation -------- 
-    # compute the gradient of loss with respect to all tensors with requires_grad = TRUE.
-    loss$backward()
-
-    ### -------- Update weights -------- 
-
-    # Wrap in with_no_grad() because this is a part we DON'T want to record for automatic gradient computation
-    with_no_grad({
-
-      w1$sub_(learning_rate * w1$grad)
-      w2$sub_(learning_rate * w2$grad)
-      b1$sub_(learning_rate * b1$grad)
-      b2$sub_(learning_rate * b2$grad)
-
-      # Zero the gradients after every pass, because they'd accumulate otherwise
-      w1$grad$zero_()
-      w2$grad$zero_()
-      b1$grad$zero_()
-      b2$grad$zero_()
-
-    })
-
-}
-

We still manually compute the forward pass, and we still manually update the weights. In the last two chapters of this section, we’ll see how these parts of the logic can be made more modular and reusable, as well.

-
-
-
-
[1] Unless we want to change the data, as in adversarial example generation.
-
-
- - - -
- - - - -
- - - - - - diff --git a/docs/articles/using-autograd_files/accessible-code-block-0.0.1/empty-anchor.js b/docs/articles/using-autograd_files/accessible-code-block-0.0.1/empty-anchor.js deleted file mode 100644 index ca349fd6a570108bde9d7daace534cd651c5f042..0000000000000000000000000000000000000000 --- a/docs/articles/using-autograd_files/accessible-code-block-0.0.1/empty-anchor.js +++ /dev/null @@ -1,15 +0,0 @@ -// Hide empty tag within highlighted CodeBlock for screen reader accessibility (see https://github.com/jgm/pandoc/issues/6352#issuecomment-626106786) --> -// v0.0.1 -// Written by JooYoung Seo (jooyoung@psu.edu) and Atsushi Yasumoto on June 1st, 2020. - -document.addEventListener('DOMContentLoaded', function() { - const codeList = document.getElementsByClassName("sourceCode"); - for (var i = 0; i < codeList.length; i++) { - var linkList = codeList[i].getElementsByTagName('a'); - for (var j = 0; j < linkList.length; j++) { - if (linkList[j].innerHTML === "") { - linkList[j].setAttribute('aria-hidden', 'true'); - } - } - } -}); diff --git a/docs/articles/using-autograd_files/header-attrs-2.1.1/header-attrs.js b/docs/articles/using-autograd_files/header-attrs-2.1.1/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/using-autograd_files/header-attrs-2.1.1/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/articles/using-autograd_files/header-attrs-2.3/header-attrs.js b/docs/articles/using-autograd_files/header-attrs-2.3/header-attrs.js deleted file mode 100644 index dd57d92e02028785163a821c31bca8743a8ab59a..0000000000000000000000000000000000000000 --- a/docs/articles/using-autograd_files/header-attrs-2.3/header-attrs.js +++ /dev/null @@ -1,12 +0,0 @@ -// Pandoc 2.9 adds attributes on both header and div. We remove the former (to -// be compatible with the behavior of Pandoc < 2.8). -document.addEventListener('DOMContentLoaded', function(e) { - var hs = document.querySelectorAll("div.section[class*='level'] > :first-child"); - var i, h, a; - for (i = 0; i < hs.length; i++) { - h = hs[i]; - if (!/^h[1-6]$/i.test(h.tagName)) continue; // it should be a header h1-h6 - a = h.attributes; - while (a.length > 0) h.removeAttribute(a[0].name); - } -}); diff --git a/docs/authors.html b/docs/authors.html deleted file mode 100644 index 5d8ead1a4be11ba65ff26f9890e1fc052f175747..0000000000000000000000000000000000000000 --- a/docs/authors.html +++ /dev/null @@ -1,206 +0,0 @@ - - - - - - - - -Authors • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - - -
- -
-
- - -
  • Daniel Falbel. Author, maintainer.
  • Javier Luraschi. Author.
  • Dmitriy Selivanov. Contributor.
  • Athos Damiani. Contributor.
  • RStudio. Copyright holder.
- -
- -
- - - - -
- - - - - - - - diff --git a/docs/bootstrap-toc.css b/docs/bootstrap-toc.css deleted file mode 100644 index 5a859415c1f7eacfd94920968bc910e2f1f1427e..0000000000000000000000000000000000000000 --- a/docs/bootstrap-toc.css +++ /dev/null @@ -1,60 +0,0 @@ -/*! - * Bootstrap Table of Contents v0.4.1 (http://afeld.github.io/bootstrap-toc/) - * Copyright 2015 Aidan Feldman - * Licensed under MIT (https://github.com/afeld/bootstrap-toc/blob/gh-pages/LICENSE.md) */ - -/* modified from https://github.com/twbs/bootstrap/blob/94b4076dd2efba9af71f0b18d4ee4b163aa9e0dd/docs/assets/css/src/docs.css#L548-L601 */ - -/* All levels of nav */ -nav[data-toggle='toc'] .nav > li > a { - display: block; - padding: 4px 20px; - font-size: 13px; - font-weight: 500; - color: #767676; -} -nav[data-toggle='toc'] .nav > li > a:hover, -nav[data-toggle='toc'] .nav > li > a:focus { - padding-left: 19px; - color: #563d7c; - text-decoration: none; - background-color: transparent; - border-left: 1px solid #563d7c; -} -nav[data-toggle='toc'] .nav > .active > a, -nav[data-toggle='toc'] .nav > .active:hover > a, -nav[data-toggle='toc'] .nav > .active:focus > a { - padding-left: 18px; - font-weight: bold; - color: #563d7c; - background-color: transparent; - border-left: 2px solid #563d7c; -} - -/* Nav: second level (shown on .active) */ -nav[data-toggle='toc'] .nav .nav { - display: none; /* Hide by default, but at >768px, show it */ - padding-bottom: 10px; -} -nav[data-toggle='toc'] .nav .nav > li > a { - padding-top: 1px; - padding-bottom: 1px; - padding-left: 30px; - font-size: 12px; - font-weight: normal; -} -nav[data-toggle='toc'] .nav .nav > li > a:hover, -nav[data-toggle='toc'] .nav .nav > li > a:focus { - padding-left: 29px; -} -nav[data-toggle='toc'] .nav .nav > .active > a, -nav[data-toggle='toc'] .nav .nav > .active:hover > a, -nav[data-toggle='toc'] .nav .nav > .active:focus > a { - padding-left: 28px; - font-weight: 500; -} - -/* from https://github.com/twbs/bootstrap/blob/e38f066d8c203c3e032da0ff23cd2d6098ee2dd6/docs/assets/css/src/docs.css#L631-L634 */ -nav[data-toggle='toc'] .nav > .active > ul { - display: block; -} diff --git a/docs/bootstrap-toc.js b/docs/bootstrap-toc.js deleted file mode 100644 index 1cdd573b20f53b3ebe31c021e154c4338ca456af..0000000000000000000000000000000000000000 --- a/docs/bootstrap-toc.js +++ /dev/null @@ -1,159 +0,0 @@ -/*! 
- * Bootstrap Table of Contents v0.4.1 (http://afeld.github.io/bootstrap-toc/) - * Copyright 2015 Aidan Feldman - * Licensed under MIT (https://github.com/afeld/bootstrap-toc/blob/gh-pages/LICENSE.md) */ -(function() { - 'use strict'; - - window.Toc = { - helpers: { - // return all matching elements in the set, or their descendants - findOrFilter: function($el, selector) { - // http://danielnouri.org/notes/2011/03/14/a-jquery-find-that-also-finds-the-root-element/ - // http://stackoverflow.com/a/12731439/358804 - var $descendants = $el.find(selector); - return $el.filter(selector).add($descendants).filter(':not([data-toc-skip])'); - }, - - generateUniqueIdBase: function(el) { - var text = $(el).text(); - var anchor = text.trim().toLowerCase().replace(/[^A-Za-z0-9]+/g, '-'); - return anchor || el.tagName.toLowerCase(); - }, - - generateUniqueId: function(el) { - var anchorBase = this.generateUniqueIdBase(el); - for (var i = 0; ; i++) { - var anchor = anchorBase; - if (i > 0) { - // add suffix - anchor += '-' + i; - } - // check if ID already exists - if (!document.getElementById(anchor)) { - return anchor; - } - } - }, - - generateAnchor: function(el) { - if (el.id) { - return el.id; - } else { - var anchor = this.generateUniqueId(el); - el.id = anchor; - return anchor; - } - }, - - createNavList: function() { - return $(''); - }, - - createChildNavList: function($parent) { - var $childList = this.createNavList(); - $parent.append($childList); - return $childList; - }, - - generateNavEl: function(anchor, text) { - var $a = $(''); - $a.attr('href', '#' + anchor); - $a.text(text); - var $li = $('
<li></li>'); - $li.append($a); - return $li; - }, - - generateNavItem: function(headingEl) { - var anchor = this.generateAnchor(headingEl); - var $heading = $(headingEl); - var text = $heading.data('toc-text') || $heading.text(); - return this.generateNavEl(anchor, text); - }, - - // Find the first heading level (`<h1>`, then `<h2>`, etc.) that has more than one element. Defaults to 1 (for `<h1>
    `). - getTopLevel: function($scope) { - for (var i = 1; i <= 6; i++) { - var $headings = this.findOrFilter($scope, 'h' + i); - if ($headings.length > 1) { - return i; - } - } - - return 1; - }, - - // returns the elements for the top level, and the next below it - getHeadings: function($scope, topLevel) { - var topSelector = 'h' + topLevel; - - var secondaryLevel = topLevel + 1; - var secondarySelector = 'h' + secondaryLevel; - - return this.findOrFilter($scope, topSelector + ',' + secondarySelector); - }, - - getNavLevel: function(el) { - return parseInt(el.tagName.charAt(1), 10); - }, - - populateNav: function($topContext, topLevel, $headings) { - var $context = $topContext; - var $prevNav; - - var helpers = this; - $headings.each(function(i, el) { - var $newNav = helpers.generateNavItem(el); - var navLevel = helpers.getNavLevel(el); - - // determine the proper $context - if (navLevel === topLevel) { - // use top level - $context = $topContext; - } else if ($prevNav && $context === $topContext) { - // create a new level of the tree and switch to it - $context = helpers.createChildNavList($prevNav); - } // else use the current $context - - $context.append($newNav); - - $prevNav = $newNav; - }); - }, - - parseOps: function(arg) { - var opts; - if (arg.jquery) { - opts = { - $nav: arg - }; - } else { - opts = arg; - } - opts.$scope = opts.$scope || $(document.body); - return opts; - } - }, - - // accepts a jQuery object, or an options object - init: function(opts) { - opts = this.helpers.parseOps(opts); - - // ensure that the data attribute is in place for styling - opts.$nav.attr('data-toggle', 'toc'); - - var $topContext = this.helpers.createChildNavList(opts.$nav); - var topLevel = this.helpers.getTopLevel(opts.$scope); - var $headings = this.helpers.getHeadings(opts.$scope, topLevel); - this.helpers.populateNav($topContext, topLevel, $headings); - } - }; - - $(function() { - $('nav[data-toggle="toc"]').each(function(i, el) { - var $nav = $(el); - Toc.init($nav); - }); - }); -})(); diff --git a/docs/docsearch.css b/docs/docsearch.css deleted file mode 100644 index e5f1fe1dfa2c34c51fe941829b511acd8c763301..0000000000000000000000000000000000000000 --- a/docs/docsearch.css +++ /dev/null @@ -1,148 +0,0 @@ -/* Docsearch -------------------------------------------------------------- */ -/* - Source: https://github.com/algolia/docsearch/ - License: MIT -*/ - -.algolia-autocomplete { - display: block; - -webkit-box-flex: 1; - -ms-flex: 1; - flex: 1 -} - -.algolia-autocomplete .ds-dropdown-menu { - width: 100%; - min-width: none; - max-width: none; - padding: .75rem 0; - background-color: #fff; - background-clip: padding-box; - border: 1px solid rgba(0, 0, 0, .1); - box-shadow: 0 .5rem 1rem rgba(0, 0, 0, .175); -} - -@media (min-width:768px) { - .algolia-autocomplete .ds-dropdown-menu { - width: 175% - } -} - -.algolia-autocomplete .ds-dropdown-menu::before { - display: none -} - -.algolia-autocomplete .ds-dropdown-menu [class^=ds-dataset-] { - padding: 0; - background-color: rgb(255,255,255); - border: 0; - max-height: 80vh; -} - -.algolia-autocomplete .ds-dropdown-menu .ds-suggestions { - margin-top: 0 -} - -.algolia-autocomplete .algolia-docsearch-suggestion { - padding: 0; - overflow: visible -} - -.algolia-autocomplete .algolia-docsearch-suggestion--category-header { - padding: .125rem 1rem; - margin-top: 0; - font-size: 1.3em; - font-weight: 500; - color: #00008B; - border-bottom: 0 -} - -.algolia-autocomplete .algolia-docsearch-suggestion--wrapper { - float: none; - padding-top: 0 -} - 
-.algolia-autocomplete .algolia-docsearch-suggestion--subcategory-column { - float: none; - width: auto; - padding: 0; - text-align: left -} - -.algolia-autocomplete .algolia-docsearch-suggestion--content { - float: none; - width: auto; - padding: 0 -} - -.algolia-autocomplete .algolia-docsearch-suggestion--content::before { - display: none -} - -.algolia-autocomplete .ds-suggestion:not(:first-child) .algolia-docsearch-suggestion--category-header { - padding-top: .75rem; - margin-top: .75rem; - border-top: 1px solid rgba(0, 0, 0, .1) -} - -.algolia-autocomplete .ds-suggestion .algolia-docsearch-suggestion--subcategory-column { - display: block; - padding: .1rem 1rem; - margin-bottom: 0.1; - font-size: 1.0em; - font-weight: 400 - /* display: none */ -} - -.algolia-autocomplete .algolia-docsearch-suggestion--title { - display: block; - padding: .25rem 1rem; - margin-bottom: 0; - font-size: 0.9em; - font-weight: 400 -} - -.algolia-autocomplete .algolia-docsearch-suggestion--text { - padding: 0 1rem .5rem; - margin-top: -.25rem; - font-size: 0.8em; - font-weight: 400; - line-height: 1.25 -} - -.algolia-autocomplete .algolia-docsearch-footer { - width: 110px; - height: 20px; - z-index: 3; - margin-top: 10.66667px; - float: right; - font-size: 0; - line-height: 0; -} - -.algolia-autocomplete .algolia-docsearch-footer--logo { - background-image: url("data:image/svg+xml;utf8,"); - background-repeat: no-repeat; - background-position: 50%; - background-size: 100%; - overflow: hidden; - text-indent: -9000px; - width: 100%; - height: 100%; - display: block; - transform: translate(-8px); -} - -.algolia-autocomplete .algolia-docsearch-suggestion--highlight { - color: #FF8C00; - background: rgba(232, 189, 54, 0.1) -} - - -.algolia-autocomplete .algolia-docsearch-suggestion--text .algolia-docsearch-suggestion--highlight { - box-shadow: inset 0 -2px 0 0 rgba(105, 105, 105, .5) -} - -.algolia-autocomplete .ds-suggestion.ds-cursor .algolia-docsearch-suggestion--content { - background-color: rgba(192, 192, 192, .15) -} diff --git a/docs/docsearch.js b/docs/docsearch.js deleted file mode 100644 index b35504cd3a282816130a16881f3ebeead9c1bcb4..0000000000000000000000000000000000000000 --- a/docs/docsearch.js +++ /dev/null @@ -1,85 +0,0 @@ -$(function() { - - // register a handler to move the focus to the search bar - // upon pressing shift + "/" (i.e. "?") - $(document).on('keydown', function(e) { - if (e.shiftKey && e.keyCode == 191) { - e.preventDefault(); - $("#search-input").focus(); - } - }); - - $(document).ready(function() { - // do keyword highlighting - /* modified from https://jsfiddle.net/julmot/bL6bb5oo/ */ - var mark = function() { - - var referrer = document.URL ; - var paramKey = "q" ; - - if (referrer.indexOf("?") !== -1) { - var qs = referrer.substr(referrer.indexOf('?') + 1); - var qs_noanchor = qs.split('#')[0]; - var qsa = qs_noanchor.split('&'); - var keyword = ""; - - for (var i = 0; i < qsa.length; i++) { - var currentParam = qsa[i].split('='); - - if (currentParam.length !== 2) { - continue; - } - - if (currentParam[0] == paramKey) { - keyword = decodeURIComponent(currentParam[1].replace(/\+/g, "%20")); - } - } - - if (keyword !== "") { - $(".contents").unmark({ - done: function() { - $(".contents").mark(keyword); - } - }); - } - } - }; - - mark(); - }); -}); - -/* Search term highlighting ------------------------------*/ - -function matchedWords(hit) { - var words = []; - - var hierarchy = hit._highlightResult.hierarchy; - // loop to fetch from lvl0, lvl1, etc. 
- for (var idx in hierarchy) { - words = words.concat(hierarchy[idx].matchedWords); - } - - var content = hit._highlightResult.content; - if (content) { - words = words.concat(content.matchedWords); - } - - // return unique words - var words_uniq = [...new Set(words)]; - return words_uniq; -} - -function updateHitURL(hit) { - - var words = matchedWords(hit); - var url = ""; - - if (hit.anchor) { - url = hit.url_without_anchor + '?q=' + escape(words.join(" ")) + '#' + hit.anchor; - } else { - url = hit.url + '?q=' + escape(words.join(" ")); - } - - return url; -} diff --git a/docs/index.html b/docs/index.html deleted file mode 100644 index a36128e298c5f03de709e362febefee84db4729f..0000000000000000000000000000000000000000 --- a/docs/index.html +++ /dev/null @@ -1,272 +0,0 @@ - - - - - - - -Tensors and Neural Networks with GPU Acceleration • torch - - - - - - - - - - -
    -
    - - - - -
    -
    -
    - - -
    -

    -Installation

    -

    Run:

    -
    remotes::install_github("mlverse/torch")
    -

    On first package load, additional software will be installed.

    -
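    If the automatic setup fails or you want to trigger it yourself, the package exports install_torch() (see the function reference); a minimal sketch:
    library(torch)
    # downloads and installs the additional software (the LibTorch binaries) manually
    install_torch()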
    -
    -

    -Example

    -

    Currently this package is only a proof of concept: you can create a torch Tensor from an R object, and convert it back from a torch Tensor to an R object.

    -
    library(torch)
    -x <- array(runif(8), dim = c(2, 2, 2))
    -y <- torch_tensor(x, dtype = torch_float64())
    -y
    -#> torch_tensor 
    -#> (1,.,.) = 
    -#>   0.8687  0.0157
    -#>   0.4237  0.8971
    -#> 
    -#> (2,.,.) = 
    -#>   0.4021  0.5509
    -#>   0.3374  0.9034
    -#> [ CPUDoubleType{2,2,2} ]
    -identical(x, as_array(y))
    -#> [1] TRUE
    -
    -

    -Simple Autograd Example

    -

    In the following snippet we let torch calculate the derivatives using the autograd feature:

    -
    x <- torch_tensor(1, requires_grad = TRUE)
    -w <- torch_tensor(2, requires_grad = TRUE)
    -b <- torch_tensor(3, requires_grad = TRUE)
    -y <- w * x + b
    -y$backward()
    -x$grad
    -#> torch_tensor 
    -#>  2
    -#> [ CPUFloatType{1} ]
    -w$grad
    -#> torch_tensor 
    -#>  1
    -#> [ CPUFloatType{1} ]
    -b$grad
    -#> torch_tensor 
    -#>  1
    -#> [ CPUFloatType{1} ]
    -
    -
    -

    -Linear Regression

    -

    In the following example we are going to fit a linear regression from scratch using torch’s Autograd.

    -

    Note that all methods ending in _ (e.g. sub_) modify the tensors in place.

    -
    x <- torch_randn(100, 2)
    -y <- 0.1 + 0.5*x[,1] - 0.7*x[,2]
    -
    -w <- torch_randn(2, 1, requires_grad = TRUE)
    -b <- torch_zeros(1, requires_grad = TRUE)
    -
    -lr <- 0.5
    -for (i in 1:100) {
    -  y_hat <- torch_mm(x, w) + b
    -  loss <- torch_mean((y - y_hat$squeeze(2))^2)
    -
    -  loss$backward()
    -
    -  with_no_grad({
    -    w$sub_(w$grad*lr)
    -    b$sub_(b$grad*lr)
    -
    -    w$grad$zero_()
    -    b$grad$zero_()
    -  })
    -}
    -print(w)
    -#> torch_tensor 
    -#>  0.5000
    -#> -0.7000
    -#> [ CPUFloatType{2,1} ]
    -print(b)
    -#> torch_tensor 
    -#> 0.01 *
    -#> 10.0000
    -#> [ CPUFloatType{1} ]
    -
    -
    -
    -

    -Contributing

    -

    Whatever your current skill level, it's possible to contribute to torch development. See the contributing guide for more information.

    -
    -
    -
    - - -
    - - -
    - -
    -


    -
    - -
    -
    - - - - - - diff --git a/docs/link.svg b/docs/link.svg deleted file mode 100644 index 88ad82769b87f10725c57dca6fcf41b4bffe462c..0000000000000000000000000000000000000000 --- a/docs/link.svg +++ /dev/null @@ -1,12 +0,0 @@ - - - - - - diff --git a/docs/pkgdown.css b/docs/pkgdown.css deleted file mode 100644 index c01e5923be6ff1edbccf20f93be6bdf8dbd67bb3..0000000000000000000000000000000000000000 --- a/docs/pkgdown.css +++ /dev/null @@ -1,367 +0,0 @@ -/* Sticky footer */ - -/** - * Basic idea: https://philipwalton.github.io/solved-by-flexbox/demos/sticky-footer/ - * Details: https://github.com/philipwalton/solved-by-flexbox/blob/master/assets/css/components/site.css - * - * .Site -> body > .container - * .Site-content -> body > .container .row - * .footer -> footer - * - * Key idea seems to be to ensure that .container and __all its parents__ - * have height set to 100% - * - */ - -html, body { - height: 100%; -} - -body { - position: relative; -} - -body > .container { - display: flex; - height: 100%; - flex-direction: column; -} - -body > .container .row { - flex: 1 0 auto; -} - -footer { - margin-top: 45px; - padding: 35px 0 36px; - border-top: 1px solid #e5e5e5; - color: #666; - display: flex; - flex-shrink: 0; -} -footer p { - margin-bottom: 0; -} -footer div { - flex: 1; -} -footer .pkgdown { - text-align: right; -} -footer p { - margin-bottom: 0; -} - -img.icon { - float: right; -} - -img { - max-width: 100%; -} - -/* Fix bug in bootstrap (only seen in firefox) */ -summary { - display: list-item; -} - -/* Typographic tweaking ---------------------------------*/ - -.contents .page-header { - margin-top: calc(-60px + 1em); -} - -dd { - margin-left: 3em; -} - -/* Section anchors ---------------------------------*/ - -a.anchor { - margin-left: -30px; - display:inline-block; - width: 30px; - height: 30px; - visibility: hidden; - - background-image: url(./link.svg); - background-repeat: no-repeat; - background-size: 20px 20px; - background-position: center center; -} - -.hasAnchor:hover a.anchor { - visibility: visible; -} - -@media (max-width: 767px) { - .hasAnchor:hover a.anchor { - visibility: hidden; - } -} - - -/* Fixes for fixed navbar --------------------------*/ - -.contents h1, .contents h2, .contents h3, .contents h4 { - padding-top: 60px; - margin-top: -40px; -} - -/* Navbar submenu --------------------------*/ - -.dropdown-submenu { - position: relative; -} - -.dropdown-submenu>.dropdown-menu { - top: 0; - left: 100%; - margin-top: -6px; - margin-left: -1px; - border-radius: 0 6px 6px 6px; -} - -.dropdown-submenu:hover>.dropdown-menu { - display: block; -} - -.dropdown-submenu>a:after { - display: block; - content: " "; - float: right; - width: 0; - height: 0; - border-color: transparent; - border-style: solid; - border-width: 5px 0 5px 5px; - border-left-color: #cccccc; - margin-top: 5px; - margin-right: -10px; -} - -.dropdown-submenu:hover>a:after { - border-left-color: #ffffff; -} - -.dropdown-submenu.pull-left { - float: none; -} - -.dropdown-submenu.pull-left>.dropdown-menu { - left: -100%; - margin-left: 10px; - border-radius: 6px 0 6px 6px; -} - -/* Sidebar --------------------------*/ - -#pkgdown-sidebar { - margin-top: 30px; - position: -webkit-sticky; - position: sticky; - top: 70px; -} - -#pkgdown-sidebar h2 { - font-size: 1.5em; - margin-top: 1em; -} - -#pkgdown-sidebar h2:first-child { - margin-top: 0; -} - -#pkgdown-sidebar .list-unstyled li { - margin-bottom: 0.5em; -} - -/* bootstrap-toc tweaks ------------------------------------------------------*/ - -/* All 
levels of nav */ - -nav[data-toggle='toc'] .nav > li > a { - padding: 4px 20px 4px 6px; - font-size: 1.5rem; - font-weight: 400; - color: inherit; -} - -nav[data-toggle='toc'] .nav > li > a:hover, -nav[data-toggle='toc'] .nav > li > a:focus { - padding-left: 5px; - color: inherit; - border-left: 1px solid #878787; -} - -nav[data-toggle='toc'] .nav > .active > a, -nav[data-toggle='toc'] .nav > .active:hover > a, -nav[data-toggle='toc'] .nav > .active:focus > a { - padding-left: 5px; - font-size: 1.5rem; - font-weight: 400; - color: inherit; - border-left: 2px solid #878787; -} - -/* Nav: second level (shown on .active) */ - -nav[data-toggle='toc'] .nav .nav { - display: none; /* Hide by default, but at >768px, show it */ - padding-bottom: 10px; -} - -nav[data-toggle='toc'] .nav .nav > li > a { - padding-left: 16px; - font-size: 1.35rem; -} - -nav[data-toggle='toc'] .nav .nav > li > a:hover, -nav[data-toggle='toc'] .nav .nav > li > a:focus { - padding-left: 15px; -} - -nav[data-toggle='toc'] .nav .nav > .active > a, -nav[data-toggle='toc'] .nav .nav > .active:hover > a, -nav[data-toggle='toc'] .nav .nav > .active:focus > a { - padding-left: 15px; - font-weight: 500; - font-size: 1.35rem; -} - -/* orcid ------------------------------------------------------------------- */ - -.orcid { - font-size: 16px; - color: #A6CE39; - /* margins are required by official ORCID trademark and display guidelines */ - margin-left:4px; - margin-right:4px; - vertical-align: middle; -} - -/* Reference index & topics ----------------------------------------------- */ - -.ref-index th {font-weight: normal;} - -.ref-index td {vertical-align: top;} -.ref-index .icon {width: 40px;} -.ref-index .alias {width: 40%;} -.ref-index-icons .alias {width: calc(40% - 40px);} -.ref-index .title {width: 60%;} - -.ref-arguments th {text-align: right; padding-right: 10px;} -.ref-arguments th, .ref-arguments td {vertical-align: top;} -.ref-arguments .name {width: 20%;} -.ref-arguments .desc {width: 80%;} - -/* Nice scrolling for wide elements --------------------------------------- */ - -table { - display: block; - overflow: auto; -} - -/* Syntax highlighting ---------------------------------------------------- */ - -pre { - word-wrap: normal; - word-break: normal; - border: 1px solid #eee; -} - -pre, code { - background-color: #f8f8f8; - color: #333; -} - -pre code { - overflow: auto; - word-wrap: normal; - white-space: pre; -} - -pre .img { - margin: 5px 0; -} - -pre .img img { - background-color: #fff; - display: block; - height: auto; -} - -code a, pre a { - color: #375f84; -} - -a.sourceLine:hover { - text-decoration: none; -} - -.fl {color: #1514b5;} -.fu {color: #000000;} /* function */ -.ch,.st {color: #036a07;} /* string */ -.kw {color: #264D66;} /* keyword */ -.co {color: #888888;} /* comment */ - -.message { color: black; font-weight: bolder;} -.error { color: orange; font-weight: bolder;} -.warning { color: #6A0366; font-weight: bolder;} - -/* Clipboard --------------------------*/ - -.hasCopyButton { - position: relative; -} - -.btn-copy-ex { - position: absolute; - right: 0; - top: 0; - visibility: hidden; -} - -.hasCopyButton:hover button.btn-copy-ex { - visibility: visible; -} - -/* headroom.js ------------------------ */ - -.headroom { - will-change: transform; - transition: transform 200ms linear; -} -.headroom--pinned { - transform: translateY(0%); -} -.headroom--unpinned { - transform: translateY(-100%); -} - -/* mark.js ----------------------------*/ - -mark { - background-color: rgba(255, 255, 51, 0.5); - 
border-bottom: 2px solid rgba(255, 153, 51, 0.3); - padding: 1px; -} - -/* vertical spacing after htmlwidgets */ -.html-widget { - margin-bottom: 10px; -} - -/* fontawesome ------------------------ */ - -.fab { - font-family: "Font Awesome 5 Brands" !important; -} - -/* don't display links in code chunks when printing */ -/* source: https://stackoverflow.com/a/10781533 */ -@media print { - code a:link:after, code a:visited:after { - content: ""; - } -} diff --git a/docs/pkgdown.js b/docs/pkgdown.js deleted file mode 100644 index 7e7048faebb92b85ed06afddd1a8a4581241d6a4..0000000000000000000000000000000000000000 --- a/docs/pkgdown.js +++ /dev/null @@ -1,108 +0,0 @@ -/* http://gregfranko.com/blog/jquery-best-practices/ */ -(function($) { - $(function() { - - $('.navbar-fixed-top').headroom(); - - $('body').css('padding-top', $('.navbar').height() + 10); - $(window).resize(function(){ - $('body').css('padding-top', $('.navbar').height() + 10); - }); - - $('[data-toggle="tooltip"]').tooltip(); - - var cur_path = paths(location.pathname); - var links = $("#navbar ul li a"); - var max_length = -1; - var pos = -1; - for (var i = 0; i < links.length; i++) { - if (links[i].getAttribute("href") === "#") - continue; - // Ignore external links - if (links[i].host !== location.host) - continue; - - var nav_path = paths(links[i].pathname); - - var length = prefix_length(nav_path, cur_path); - if (length > max_length) { - max_length = length; - pos = i; - } - } - - // Add class to parent
  • , and enclosing
  • if in dropdown - if (pos >= 0) { - var menu_anchor = $(links[pos]); - menu_anchor.parent().addClass("active"); - menu_anchor.closest("li.dropdown").addClass("active"); - } - }); - - function paths(pathname) { - var pieces = pathname.split("/"); - pieces.shift(); // always starts with / - - var end = pieces[pieces.length - 1]; - if (end === "index.html" || end === "") - pieces.pop(); - return(pieces); - } - - // Returns -1 if not found - function prefix_length(needle, haystack) { - if (needle.length > haystack.length) - return(-1); - - // Special case for length-0 haystack, since for loop won't run - if (haystack.length === 0) { - return(needle.length === 0 ? 0 : -1); - } - - for (var i = 0; i < haystack.length; i++) { - if (needle[i] != haystack[i]) - return(i); - } - - return(haystack.length); - } - - /* Clipboard --------------------------*/ - - function changeTooltipMessage(element, msg) { - var tooltipOriginalTitle=element.getAttribute('data-original-title'); - element.setAttribute('data-original-title', msg); - $(element).tooltip('show'); - element.setAttribute('data-original-title', tooltipOriginalTitle); - } - - if(ClipboardJS.isSupported()) { - $(document).ready(function() { - var copyButton = ""; - - $(".examples, div.sourceCode").addClass("hasCopyButton"); - - // Insert copy buttons: - $(copyButton).prependTo(".hasCopyButton"); - - // Initialize tooltips: - $('.btn-copy-ex').tooltip({container: 'body'}); - - // Initialize clipboard: - var clipboardBtnCopies = new ClipboardJS('[data-clipboard-copy]', { - text: function(trigger) { - return trigger.parentNode.textContent; - } - }); - - clipboardBtnCopies.on('success', function(e) { - changeTooltipMessage(e.trigger, 'Copied!'); - e.clearSelection(); - }); - - clipboardBtnCopies.on('error', function() { - changeTooltipMessage(e.trigger,'Press Ctrl+C or Command+C to copy'); - }); - }); - } -})(window.jQuery || window.$) diff --git a/docs/pkgdown.yml b/docs/pkgdown.yml deleted file mode 100644 index ffc2acea34c75c2a31440ddcee5eedb5de3c9431..0000000000000000000000000000000000000000 --- a/docs/pkgdown.yml +++ /dev/null @@ -1,14 +0,0 @@ -pandoc: 2.9.2.1 -pkgdown: 1.5.1 -pkgdown_sha: ~ -articles: - mnist-cnn: examples/mnist-cnn.html - mnist-dcgan: examples/mnist-dcgan.html - mnist-mlp: examples/mnist-mlp.html - extending-autograd: extending-autograd.html - indexing: indexing.html - loading-data: loading-data.html - tensor-creation: tensor-creation.html - using-autograd: using-autograd.html -last_built: 2020-07-22T19:57Z - diff --git a/docs/reference/AutogradContext.html b/docs/reference/AutogradContext.html deleted file mode 100644 index b1fcfa13c646f16eb14583decd96d9d5e50e9ca6..0000000000000000000000000000000000000000 --- a/docs/reference/AutogradContext.html +++ /dev/null @@ -1,306 +0,0 @@ - - - - - - - - -Class representing the context. — AutogradContext • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Class representing the context.

    -

    Class representing the context.

    -
    - - - -

    Public fields

    - -

    -
    ptr

    (Dev-related) pointer to the underlying C++ context object.

    - -

    -

    Active bindings

    - -

    -
    needs_input_grad

    named list of booleans indicating, for each argument of forward, whether it requires gradient.

    - -
    saved_variables

    list of objects that were saved for backward via save_for_backward.

    - -

    -

    Methods

    - - -

    Public methods

    - - -


    -

    Method new()

    -

    (Dev-related) Initializes the context. Not meant to be called by users.

    Usage

    -

    AutogradContext$new(
    -  ptr,
    -  env,
    -  argument_names = NULL,
    -  argument_needs_grad = NULL
    -)

    - -

    Arguments

    -

    -
    ptr

    pointer to the C++ object

    - -
    env

    environment that encloses both forward and backward

    - -
    argument_names

    names of forward arguments

    - -
    argument_needs_grad

    whether each argument in forward needs grad.

    - -

    -


    -

    Method save_for_backward()

    -

    Saves given objects for a future call to backward().

    -

    This should be called at most once, and only from inside the forward() method.

    -

    Later, saved objects can be accessed through the saved_variables attribute. Before returning them to the user, a check is made to ensure they weren’t used in any in-place operation that modified their content.

    -

    Arguments can also be any kind of R object.

    Usage

    -

    AutogradContext$save_for_backward(...)

    - -

    Arguments

    -

    -
    ...

    any kind of R object that will be saved for the backward pass. It's common to pass named arguments.

    - -
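    A minimal sketch of the save-then-retrieve pattern; square2 and its argument names are illustrative, not part of the API:
    square2 <- autograd_function(
      forward = function(ctx, x) {
        # stash the input so that backward() can reuse it
        ctx$save_for_backward(x = x)
        x^2
      },
      backward = function(ctx, grad_output) {
        # retrieve what forward() saved
        x <- ctx$saved_variables$x
        list(x = 2 * x * grad_output)
      }
    )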

    -


    -

    Method mark_non_differentiable()

    -

    Marks outputs as non-differentiable.

    -

    This should be called at most once, only from inside the forward() method, and all arguments should be outputs.

    -

    This will mark outputs as not requiring gradients, increasing the efficiency of backward computation. You still need to accept a gradient for each output in backward(), but it’s always going to be a zero tensor with the same shape as the corresponding output.

    -

    This is used e.g. for indices returned from a max Function.

    Usage

    -

    AutogradContext$mark_non_differentiable(...)

    - -

    Arguments

    -

    -
    ...

    non-differentiable outputs.

    - -
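    A hedged sketch of a custom function whose integer output is marked non-differentiable (which_max is an illustrative name):
    which_max <- autograd_function(
      forward = function(ctx, x) {
        ctx$save_for_backward(x = x)
        idx <- x$argmax()
        # indices carry no gradient information
        ctx$mark_non_differentiable(idx)
        idx
      },
      backward = function(ctx, grad_output) {
        # grad_output arrives as a zero tensor; the input gets zero gradient
        list(x = torch_zeros_like(ctx$saved_variables$x))
      }
    )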

    -


    -

    Method mark_dirty()

    -

    Marks given tensors as modified in an in-place operation.

    -

    This should be called at most once, only from inside the forward() method, and all arguments should be inputs.

    -

    Every tensor that’s been modified in-place in a call to forward() should be given to this function, to ensure the correctness of our checks. It doesn’t matter whether the function is called before or after the modification.

    Usage

    -

    AutogradContext$mark_dirty(...)

    - -

    Arguments

    -

    -
    ...

    tensors that are modified in-place.

    - -
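    A minimal sketch of an in-place operation that flags its input as dirty (double_ is an illustrative name):
    double_ <- autograd_function(
      forward = function(ctx, x) {
        x$mul_(2)          # modify the input in place
        ctx$mark_dirty(x)  # let autograd know about the modification
        x
      },
      backward = function(ctx, grad_output) {
        list(x = 2 * grad_output)
      }
    )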

    -


    -

    Method clone()

    -

    The objects of this class are cloneable with this method.

    Usage

    -

    AutogradContext$clone(deep = FALSE)

    - -

    Arguments

    -

    -
    deep

    Whether to make a deep clone.

    - -

    - - - -
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/as_array.html b/docs/reference/as_array.html deleted file mode 100644 index 6e35287d8c32d6f26f6b45fbf6eed26d9a9f0ce2..0000000000000000000000000000000000000000 --- a/docs/reference/as_array.html +++ /dev/null @@ -1,205 +0,0 @@ - - - - - - - - -Converts to array — as_array • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Converts to array

    -
    - -
    as_array(x)
    - -

    Arguments

    - - - - - - -
    x

    object to be converted into an array

    - - -
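    A short round-trip sketch:
    x <- torch_tensor(matrix(1:4, ncol = 2))
    as_array(x)  # back to an R matrix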
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/autograd_backward.html b/docs/reference/autograd_backward.html deleted file mode 100644 index 087f1d74cc4d1f88f7762a851079897019537cd2..0000000000000000000000000000000000000000 --- a/docs/reference/autograd_backward.html +++ /dev/null @@ -1,258 +0,0 @@ - - - - - - - - -Computes the sum of gradients of given tensors w.r.t. graph leaves. — autograd_backward • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    The graph is differentiated using the chain rule. If any of tensors are non-scalar (i.e. their data has more than one element) and require gradient, then the Jacobian-vector product is computed; in this case the function additionally requires specifying grad_tensors. It should be a sequence of matching length that contains the “vector” in the Jacobian-vector product, usually the gradient of the differentiated function w.r.t. the corresponding tensors (NULL is an acceptable value for all tensors that don’t need gradient tensors).

    -
    - -
    autograd_backward(
    -  tensors,
    -  grad_tensors = NULL,
    -  retain_graph = create_graph,
    -  create_graph = FALSE
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    tensors

    (list of Tensor) – Tensors of which the derivative will be computed.

    grad_tensors

    (list of (Tensor or NULL)) – The “vector” in the Jacobian-vector product, usually gradients w.r.t. each element of corresponding tensors. NULL values can be specified for scalar Tensors or ones that don’t require grad. If a NULL value would be acceptable for all grad_tensors, then this argument is optional.

    retain_graph

    (bool, optional) – If FALSE, the graph used to compute the grad will be freed. Note that in nearly all cases setting this option to TRUE is not needed and often can be worked around in a much more efficient way. Defaults to the value of create_graph.

    create_graph

    (bool, optional) – If TRUE, the graph of the derivative will be constructed, allowing one to compute higher order derivative products. Defaults to FALSE.

    - -

    Details

    - -

    This function accumulates gradients in the leaves; you might need to zero them before calling it.

    - -

    Examples

    -
    # \dontrun{
    -x <- torch_tensor(1, requires_grad = TRUE)
    -y <- 2 * x
    -
    -a <- torch_tensor(1, requires_grad = TRUE)
    -b <- 3 * a
    -
    -autograd_backward(list(y, b))
    -
    -# }
    -
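    A hedged follow-up sketch: when the differentiated tensors are non-scalar, grad_tensors must supply the “vector” of the Jacobian-vector product:
    x <- torch_tensor(c(1, 2), requires_grad = TRUE)
    y <- x * x
    # y has two elements, so pass the weighting vector explicitly
    autograd_backward(list(y), grad_tensors = list(torch_ones(2)))
    x$grad  # equals 2 * x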
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/autograd_function.html b/docs/reference/autograd_function.html deleted file mode 100644 index f5ca06c73a4f9fd7071d2d73b68170f6ad245981..0000000000000000000000000000000000000000 --- a/docs/reference/autograd_function.html +++ /dev/null @@ -1,246 +0,0 @@ - - - - - - - - -Records operation history and defines formulas for differentiating ops. — autograd_function • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Every operation performed on Tensors creates a new function object that performs the computation and records that it happened. The history is retained in the form of a DAG of functions, with edges denoting data dependencies (input <- output). Then, when backward is called, the graph is processed in topological order, by calling the backward() methods of each Function object and passing returned gradients on to the next Functions.

    -
    - -
    autograd_function(forward, backward)
    - -

    Arguments

    - - - - - - - - - - -
    forward

    Performs the operation. It must accept a context ctx as the first argument, followed by any number of arguments (tensors or other types). The context can be used to store tensors that can then be retrieved during the backward pass. See AutogradContext for more information about context methods.

    backward

    Defines a formula for differentiating the operation. It must accept a context ctx as the first argument, followed by as many arguments as forward() returned outputs, and it should return a named list. Each argument is the gradient w.r.t. the given output, and each element in the returned list should be the gradient w.r.t. the corresponding input. The context can be used to retrieve tensors saved during the forward pass. It also has an attribute ctx$needs_input_grad, a named list of booleans representing whether each input needs gradient. E.g., backward() will have ctx$needs_input_grad$input = TRUE if the input argument to forward() needs the gradient computed w.r.t. the output. See AutogradContext for more information about context methods.

    - - -

    Examples

    -
    # \dontrun{
    -exp2 <- autograd_function(
    -  forward = function(ctx, i) {
    -    result <- i$exp()
    -    ctx$save_for_backward(result = result)
    -    result
    -  },
    -  backward = function(ctx, grad_output) {
    -    list(i = grad_output * ctx$saved_variables$result)
    -  }
    -)
    -# }
    -
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/autograd_grad.html b/docs/reference/autograd_grad.html deleted file mode 100644 index da8949015b9bdd4421f3af4af4fc240ef1163104..0000000000000000000000000000000000000000 --- a/docs/reference/autograd_grad.html +++ /dev/null @@ -1,272 +0,0 @@ - - - - - - - - -Computes and returns the sum of gradients of outputs w.r.t. the inputs. — autograd_grad • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    grad_outputs should be a list of length matching output, containing the “vector” in the Jacobian-vector product, usually the pre-computed gradients w.r.t. each of the outputs. If an output doesn’t require_grad, then the gradient can be NULL.

    -
    - -
    autograd_grad(
    -  outputs,
    -  inputs,
    -  grad_outputs = NULL,
    -  retain_graph = create_graph,
    -  create_graph = FALSE,
    -  allow_unused = FALSE
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    outputs

    (sequence of Tensor) – outputs of the differentiated function.

    inputs

    (sequence of Tensor) – Inputs w.r.t. which the gradient will be returned (and not accumulated into .grad).

    grad_outputs

    (sequence of Tensor) – The “vector” in the Jacobian-vector product, usually gradients w.r.t. each output. NULL values can be specified for scalar Tensors or ones that don’t require grad. If a NULL value would be acceptable for all grad_outputs, then this argument is optional. Default: NULL.

    retain_graph

    (bool, optional) – If FALSE, the graph used to compute the grad will be freed. Note that in nearly all cases setting this option to TRUE is not needed and often can be worked around in a much more efficient way. Defaults to the value of create_graph.

    create_graph

    (bool, optional) – If TRUE, the graph of the derivative will be constructed, allowing one to compute higher order derivative products. Default: FALSE.

    allow_unused

    (bool, optional) – If FALSE, specifying inputs that were not used when computing outputs (and therefore whose grad is always zero) is an error. Defaults to FALSE.

    - -

    Details

    - -

    If only_inputs is TRUE, the function will only return a list of gradients w.r.t. the specified inputs. If it’s FALSE, then gradients w.r.t. all remaining leaves will still be computed, and will be accumulated into their .grad attribute.

    - -

    Examples

    -
    # \dontrun{
    -w <- torch_tensor(0.5, requires_grad = TRUE)
    -b <- torch_tensor(0.9, requires_grad = TRUE)
    -x <- torch_tensor(runif(100))
    -y <- 2 * x + 1
    -loss <- (y - (w*x + b))^2
    -loss <- loss$mean()
    -
    -o <- autograd_grad(loss, list(w, b))
    -o
    #> [[1]]
    #> torch_tensor
    #> -0.9935
    #> [ CPUFloatType{1} ]
    #>
    #> [[2]]
    #> torch_tensor
    #> -1.6206
    #> [ CPUFloatType{1} ]
    -# }
    -
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/autograd_set_grad_mode.html b/docs/reference/autograd_set_grad_mode.html deleted file mode 100644 index f91387ad10edcea706a0a14794aaf2db6d3b7d3e..0000000000000000000000000000000000000000 --- a/docs/reference/autograd_set_grad_mode.html +++ /dev/null @@ -1,205 +0,0 @@ - - - - - - - - -Set grad mode — autograd_set_grad_mode • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Enables or disables gradient recording.

    -
    - -
    autograd_set_grad_mode(enabled)
    - -

    Arguments

    - - - - - - -
    enabled

    bool, whether to enable or disable gradient recording.

    - - -
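    A minimal usage sketch (with_no_grad() is usually the more convenient wrapper):
    autograd_set_grad_mode(enabled = FALSE)
    x <- torch_tensor(1, requires_grad = TRUE)
    y <- x * 2  # this operation is not recorded in the graph
    autograd_set_grad_mode(enabled = TRUE)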
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/cuda_current_device.html b/docs/reference/cuda_current_device.html deleted file mode 100644 index 8a6145c0554b94a9ac39157bfc5f674d2809ac78..0000000000000000000000000000000000000000 --- a/docs/reference/cuda_current_device.html +++ /dev/null @@ -1,197 +0,0 @@ - - - - - - - - -Returns the index of a currently selected device. — cuda_current_device • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Returns the index of a currently selected device.

    -
    - -
    cuda_current_device()
    - - - -
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/cuda_device_count.html b/docs/reference/cuda_device_count.html deleted file mode 100644 index 4094cfa2dc67ba7ed99c08dcba3c59ff152a80f0..0000000000000000000000000000000000000000 --- a/docs/reference/cuda_device_count.html +++ /dev/null @@ -1,197 +0,0 @@ - - - - - - - - -Returns the number of GPUs available. — cuda_device_count • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Returns the number of GPUs available.

    -
    - -
    cuda_device_count()
    - - - -
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/cuda_is_available.html b/docs/reference/cuda_is_available.html deleted file mode 100644 index 0a5dec9940654b1f8e0a95669c8049519f702272..0000000000000000000000000000000000000000 --- a/docs/reference/cuda_is_available.html +++ /dev/null @@ -1,197 +0,0 @@ - - - - - - - - -Returns a bool indicating if CUDA is currently available. — cuda_is_available • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Returns a bool indicating if CUDA is currently available.

    -
    - -
    cuda_is_available()
    - - - -
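    A hedged sketch combining the cuda_* helpers documented in this reference:
    if (cuda_is_available()) {
      cat("GPUs available:", cuda_device_count(), "\n")
      cat("current device index:", cuda_current_device(), "\n")
    } else {
      message("CUDA not available; tensors stay on the CPU")
    }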
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/dataloader.html b/docs/reference/dataloader.html deleted file mode 100644 index 2e3cc18e42d233d6a2998e0ea287daf77fbef00a..0000000000000000000000000000000000000000 --- a/docs/reference/dataloader.html +++ /dev/null @@ -1,278 +0,0 @@ - - - - - - - - -Data loader. Combines a dataset and a sampler, and provides -single- or multi-process iterators over the dataset. — dataloader • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Data loader. Combines a dataset and a sampler, and provides single- or multi-process iterators over the dataset.

    -
    - -
    dataloader(
    -  dataset,
    -  batch_size = 1,
    -  shuffle = FALSE,
    -  sampler = NULL,
    -  batch_sampler = NULL,
    -  num_workers = 0,
    -  collate_fn = NULL,
    -  pin_memory = FALSE,
    -  drop_last = FALSE,
    -  timeout = 0,
    -  worker_init_fn = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    dataset

    (Dataset): dataset from which to load the data.

    batch_size

    (int, optional): how many samples per batch to load (default: 1).

    shuffle

    (bool, optional): set to TRUE to have the data reshuffled at every epoch (default: FALSE).

    sampler

    (Sampler, optional): defines the strategy to draw samples from the dataset. If specified, shuffle must be FALSE.

    batch_sampler

    (Sampler, optional): like sampler, but returns a batch of indices at a time. Mutually exclusive with batch_size, shuffle, sampler, and drop_last.

    num_workers

    (int, optional): how many subprocesses to use for data loading. 0 means that the data will be loaded in the main process. (default: 0)

    collate_fn

    (callable, optional): merges a list of samples to form a mini-batch.

    pin_memory

    (bool, optional): If TRUE, the data loader will copy tensors into CUDA pinned memory before returning them. If your data elements are a custom type, or your collate_fn returns a batch that is a custom type, see the example below.

    drop_last

    (bool, optional): set to TRUE to drop the last incomplete batch if the dataset size is not divisible by the batch size. If FALSE and the size of the dataset is not divisible by the batch size, then the last batch will be smaller. (default: FALSE)

    timeout

    (numeric, optional): if positive, the timeout value for collecting a batch from workers. Should always be non-negative. (default: 0)

    worker_init_fn

    (callable, optional): If not NULL, this will be called on each worker subprocess with the worker id (an int in [0, num_workers - 1]) as input, after seeding and before data loading. (default: NULL)

    - - -
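    A minimal sketch, assuming tensor_dataset() (available in current torch, but an assumption for this snapshot of the docs):
    ds <- tensor_dataset(x = torch_randn(100, 3), y = torch_randn(100))
    dl <- dataloader(ds, batch_size = 32, shuffle = TRUE)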
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/dataloader_make_iter.html b/docs/reference/dataloader_make_iter.html deleted file mode 100644 index 8e3cf682a4b5c76e057710a33908abe14926eb74..0000000000000000000000000000000000000000 --- a/docs/reference/dataloader_make_iter.html +++ /dev/null @@ -1,205 +0,0 @@ - - - - - - - - -Creates an iterator from a DataLoader — dataloader_make_iter • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Creates an iterator from a DataLoader

    -
    - -
    dataloader_make_iter(dataloader)
    - -

    Arguments

    - - - - - - -
    dataloader

    a dataloader object.

    - - -
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/dataloader_next.html b/docs/reference/dataloader_next.html deleted file mode 100644 index 0ab5eef50c660c10863d8d58c6ae173acaa48c24..0000000000000000000000000000000000000000 --- a/docs/reference/dataloader_next.html +++ /dev/null @@ -1,205 +0,0 @@ - - - - - - - - -Get the next element of a dataloader iterator — dataloader_next • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Get the next element of a dataloader iterator

    -
    - -
    dataloader_next(iter)
    - -

    Arguments

    - - - - - - -
    iter

    a DataLoader iter created with dataloader_make_iter.

    - - -
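    A hedged end-to-end sketch of the iterator pair (tensor_dataset() is an assumption, as above):
    dl <- dataloader(tensor_dataset(x = torch_randn(10, 3)), batch_size = 4)
    iter <- dataloader_make_iter(dl)
    batch <- dataloader_next(iter)  # first batch of 4 rows
    batch <- dataloader_next(iter)  # the following batch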
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/dataset.html b/docs/reference/dataset.html deleted file mode 100644 index 35b359ea3800b1750ba4b0d9a4fa127cae3fb100..0000000000000000000000000000000000000000 --- a/docs/reference/dataset.html +++ /dev/null @@ -1,230 +0,0 @@ - - - - - - - - -An abstract class representing a <code>Dataset</code>. — dataset • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    All datasets that represent a map from keys to data samples should subclass it. All subclasses should overwrite get_item, supporting fetching a data sample for a given key. Subclasses could also optionally overwrite length, which is expected to return the size of the dataset and is used by many torch.utils.data.Sampler implementations and the default options of torch.utils.data.DataLoader.

    -
    - -
    dataset(name = NULL, inherit = Dataset, ...)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    name

    a name for the dataset. It's also used as the class for it.

    inherit

    you can optionally inherit from a dataset when creating a new dataset.

    ...

    public methods for the dataset class

    - -

    Note

    - -

    torch.utils.data.DataLoader by default constructs an index sampler that yields integral indices. To make it work with a map-style dataset with non-integral indices/keys, a custom sampler must be provided.

    - -
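    A minimal sketch of a map-style dataset. The method names (.getitem, .length) follow the current torch API and are an assumption for this version of the docs:
    toy <- dataset(
      name = "toy_dataset",
      initialize = function(n) {
        self$x <- torch_randn(n, 3)
        self$y <- torch_randn(n)
      },
      # fetch one (x, y) pair by key
      .getitem = function(i) list(x = self$x[i, ], y = self$y[i]),
      # the size of the dataset
      .length = function() self$x$size(1)
    )
    ds <- toy(10)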
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/default_dtype.html b/docs/reference/default_dtype.html deleted file mode 100644 index 5d27dca58158c960c8c978b76c2e3921c8a4e52c..0000000000000000000000000000000000000000 --- a/docs/reference/default_dtype.html +++ /dev/null @@ -1,208 +0,0 @@ - - - - - - - - -Gets and sets the default floating point dtype. — torch_set_default_dtype • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Gets and sets the default floating point dtype.

    -
    - -
    torch_set_default_dtype(d)
    -
    -torch_get_default_dtype()
    - -

    Arguments

    - - - - - - -
    d

    The default floating point dtype to set. Initially set to torch_float().

    - - -
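    A short usage sketch:
    torch_get_default_dtype()  # torch_float32 initially
    torch_set_default_dtype(torch_float64())
    torch_tensor(1)$dtype      # now torch_float64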
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/enumerate.dataloader.html b/docs/reference/enumerate.dataloader.html deleted file mode 100644 index 9499c94fa6ddf21677c847ac187b34138f0f1351..0000000000000000000000000000000000000000 --- a/docs/reference/enumerate.dataloader.html +++ /dev/null @@ -1,214 +0,0 @@ - - - - - - - - -Enumerate an iterator — enumerate.dataloader • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Enumerate an iterator

    -
    - -
    # S3 method for dataloader
    -enumerate(x, max_len = 1e+06, ...)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    x

    the generator to enumerate.

    max_len

    maximum number of iterations.

    ...

    passed to specific methods.

    - - -
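    A hedged loop sketch over a dataloader (tensor_dataset() is again an assumption):
    dl <- dataloader(tensor_dataset(x = torch_randn(20, 3)), batch_size = 5)
    for (batch in enumerate(dl)) {
      print(batch$x$size())
    }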
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/enumerate.html b/docs/reference/enumerate.html deleted file mode 100644 index 86deb56d28e16583e1dc529046e123b8ed29fecb..0000000000000000000000000000000000000000 --- a/docs/reference/enumerate.html +++ /dev/null @@ -1,209 +0,0 @@ - - - - - - - - -Enumerate an iterator — enumerate • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Enumerate an iterator

    -
    - -
    enumerate(x, ...)
    - -

    Arguments

    - - - - - - - - - - -
    x

    the generator to enumerate.

    ...

    passed to specific methods.

    - - -
    - -
    - - -
    - - -
    -


    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/figures/torch.png b/docs/reference/figures/torch.png deleted file mode 100644 index 61d24b86074b110f4cf3298f417c4148938c8f05..0000000000000000000000000000000000000000 Binary files a/docs/reference/figures/torch.png and /dev/null differ diff --git a/docs/reference/index.html b/docs/reference/index.html deleted file mode 100644 index d421206ea891bd65f4a56bc6637e9d9a59a46374..0000000000000000000000000000000000000000 --- a/docs/reference/index.html +++ /dev/null @@ -1,2839 +0,0 @@ - - - - - - - - -Function reference • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -

    All functions

    -

    -
    -

    AutogradContext

    -

    Class representing the context.

    -

    as_array()

    -

    Converts to array

    -

    autograd_backward()

    -

    Computes the sum of gradients of given tensors w.r.t. graph leaves.

    -

    autograd_function()

    -

    Records operation history and defines formulas for differentiating ops.

    -

    autograd_grad()

    -

    Computes and returns the sum of gradients of outputs w.r.t. the inputs.

    -

    autograd_set_grad_mode()

    -

    Set grad mode

    -

    cuda_current_device()

    -

    Returns the index of a currently selected device.

    -

    cuda_device_count()

    -

    Returns the number of GPUs available.

    -

    cuda_is_available()

    -

    Returns a bool indicating if CUDA is currently available.

    -

    dataloader()

    -

    Data loader. Combines a dataset and a sampler, and provides single- or multi-process iterators over the dataset.

    -

    dataloader_make_iter()

    -

    Creates an iterator from a DataLoader

    -

    dataloader_next()

    -

    Get the next element of a dataloader iterator

    -

    dataset()

    -

    An abstract class representing a Dataset.

    -

    torch_set_default_dtype() torch_get_default_dtype()

    -

    Gets and sets the default floating point dtype.

    -

    enumerate()

    -

    Enumerate an iterator

    -

    enumerate(<dataloader>)

    -

    Enumerate an iterator

    -

    install_torch()

    -

    Install Torch

    -

    is_dataloader()

    -

    Checks if the object is a dataloader

    -

    is_torch_dtype()

    -

    Check if object is a torch data type

    -

    is_torch_layout()

    -

    Check if an object is a torch layout.

    -

    is_torch_memory_format()

    -

    Check if an object is a memory format

    -

    is_torch_qscheme()

    -

    Checks if an object is a QScheme

    -

    kmnist_dataset()

    -

    Kuzushiji-MNIST

    -

    mnist_dataset()

    -

    MNIST dataset

    -

    nn_adaptive_log_softmax_with_loss()

    -

    AdaptiveLogSoftmaxWithLoss module

    -

    nn_batch_norm1d()

    -

    BatchNorm1D module

    -

    nn_batch_norm2d()

    -

    BatchNorm2D

    -

    nn_bce_loss()

    -

    Binary cross entropy loss

    -

    nn_bilinear()

    -

    Bilinear module

    -

    nn_celu()

    -

    CELU module

    -

    nn_conv1d()

    -

    Conv1D module

    -

    nn_conv2d()

    -

    Conv2D module

    -

    nn_conv3d()

    -

    Conv3D module

    -

    nn_conv_transpose1d()

    -

    ConvTranspose1D

    -

    nn_conv_transpose2d()

    -

    ConvTranspose2D module

    -

    nn_conv_transpose3d()

    -

    ConvTranspose3D module

    -

    nn_cross_entropy_loss()

    -

    CrossEntropyLoss module

    -

    nn_dropout()

    -

    Dropout module

    -

    nn_dropout2d()

    -

    Dropout2D module

    -

    nn_dropout3d()

    -

    Dropout3D module

    -

    nn_elu()

    -

    ELU module

    -

    nn_embedding()

    -

    Embedding module

    -

    nn_gelu()

    -

    GELU module

    -

    nn_glu()

    -

    GLU module

    -

    nn_hardshrink()

    -

    Hardshrink module

    -

    nn_hardsigmoid()

    -

    Hardsigmoid module

    -

    nn_hardswish()

    -

    Hardswish module

    -

    nn_hardtanh()

    -

    Hardtanh module

    -

    nn_identity()

    -

    Identity module

    -

    nn_init_calculate_gain()

    -

    Calculate gain

    -

    nn_init_constant_()

    -

    Constant initialization

    -

    nn_init_dirac_()

    -

    Dirac initialization

    -

    nn_init_eye_()

    -

    Eye initialization

    -

    nn_init_kaiming_normal_()

    -

    Kaiming normal initialization

    -

    nn_init_kaiming_uniform_()

    -

    Kaiming uniform initialization

    -

    nn_init_normal_()

    -

    Normal initialization

    -

    nn_init_ones_()

    -

    Ones initialization

    -

    nn_init_orthogonal_()

    -

    Orthogonal initialization

    -

    nn_init_sparse_()

    -

    Sparse initialization

    -

    nn_init_trunc_normal_()

    -

    Truncated normal initialization

    -

    nn_init_uniform_()

    -

    Uniform initialization

    -

    nn_init_xavier_normal_()

    -

    Xavier normal initialization

    -

    nn_init_xavier_uniform_()

    -

    Xavier uniform initialization

    -

    nn_init_zeros_()

    -

    Zeros initialization

    -

    nn_leaky_relu()

    -

    LeakyReLU module

    -

    nn_linear()

    -

    Linear module

    -

    nn_log_sigmoid()

    -

    LogSigmoid module

    -

    nn_log_softmax()

    -

    LogSoftmax module

    -

    nn_max_pool1d()

    -

    MaxPool1D module

    -

    nn_max_pool2d()

    -

    MaxPool2D module

    -

    nn_module()

    -

    Base class for all neural network modules.

    -

    nn_module_list()

    -

    Holds submodules in a list.

    -

    nn_multihead_attention()

    -

    MultiHead attention

    -

    nn_prelu()

    -

    PReLU module

    -

    nn_relu()

    -

    ReLU module

    -

    nn_relu6()

    -

    ReLU6 module

    -

    nn_rnn()

    -

    RNN module

    -

    nn_rrelu()

    -

    RReLU module

    -

    nn_selu()

    -

    SELU module

    -

    nn_sequential()

    -

    A sequential container

    -

    nn_sigmoid()

    -

    Sigmoid module

    -

    nn_softmax()

    -

    Softmax module

    -

    nn_softmax2d()

    -

    Softmax2d module

    -

    nn_softmin()

    -

    Softmin

    -

    nn_softplus()

    -

    Softplus module

    -

    nn_softshrink()

    -

    Softshrink module

    -

    nn_softsign()

    -

    Softsign module

    -

    nn_tanh()

    -

    Tanh module

    -

    nn_tanhshrink()

    -

    Tanhshrink module

    -

    nn_threshold()

    -

    Threshold module

    -

    nn_utils_rnn_pack_padded_sequence()

    -

    Packs a Tensor containing padded sequences of variable length.

    -

    nn_utils_rnn_pack_sequence()

    -

    Packs a list of variable length Tensors

    -

    nn_utils_rnn_pad_packed_sequence()

    -

    Pads a packed batch of variable length sequences.

    -

    nn_utils_rnn_pad_sequence()

    -

    Pad a list of variable length Tensors with padding_value

    -

    nnf_adaptive_avg_pool1d()

    -

    Adaptive_avg_pool1d

    -

    nnf_adaptive_avg_pool2d()

    -

    Adaptive_avg_pool2d

    -

    nnf_adaptive_avg_pool3d()

    -

    Adaptive_avg_pool3d

    -

    nnf_adaptive_max_pool1d()

    -

    Adaptive_max_pool1d

    -

    nnf_adaptive_max_pool2d()

    -

    Adaptive_max_pool2d

    -

    nnf_adaptive_max_pool3d()

    -

    Adaptive_max_pool3d

    -

    nnf_affine_grid()

    -

    Affine_grid

    -

    nnf_alpha_dropout()

    -

    Alpha_dropout

    -

    nnf_avg_pool1d()

    -

    Avg_pool1d

    -

    nnf_avg_pool2d()

    -

    Avg_pool2d

    -

    nnf_avg_pool3d()

    -

    Avg_pool3d

    -

    nnf_batch_norm()

    -

    Batch_norm

    -

    nnf_bilinear()

    -

    Bilinear

    -

    nnf_binary_cross_entropy()

    -

    Binary_cross_entropy

    -

    nnf_binary_cross_entropy_with_logits()

    -

    Binary_cross_entropy_with_logits

    -

    nnf_celu() nnf_celu_()

    -

    Celu

    -

    nnf_conv1d()

    -

    Conv1d

    -

    nnf_conv2d()

    -

    Conv2d

    -

    nnf_conv3d()

    -

    Conv3d

    -

    nnf_conv_tbc()

    -

    Conv_tbc

    -

    nnf_conv_transpose1d()

    -

    Conv_transpose1d

    -

    nnf_conv_transpose2d()

    -

    Conv_transpose2d

    -

    nnf_conv_transpose3d()

    -

    Conv_transpose3d

    -

    nnf_cosine_embedding_loss()

    -

    Cosine_embedding_loss

    -

    nnf_cosine_similarity()

    -

    Cosine_similarity

    -

    nnf_cross_entropy()

    -

    Cross_entropy

    -

    nnf_ctc_loss()

    -

    Ctc_loss

    -

    nnf_dropout()

    -

    Dropout

    -

    nnf_dropout2d()

    -

    Dropout2d

    -

    nnf_dropout3d()

    -

    Dropout3d

    -

    nnf_elu() nnf_elu_()

    -

    Elu

    -

    nnf_embedding()

    -

    Embedding

    -

    nnf_embedding_bag()

    -

    Embedding_bag

    -

    nnf_fold()

    -

    Fold

    -

    nnf_fractional_max_pool2d()

    -

    Fractional_max_pool2d

    -

    nnf_fractional_max_pool3d()

    -

    Fractional_max_pool3d

    -

    nnf_gelu()

    -

    Gelu

    -

    nnf_glu()

    -

    Glu

    -

    nnf_grid_sample()

    -

    Grid_sample

    -

    nnf_group_norm()

    -

    Group_norm

    -

    nnf_gumbel_softmax()

    -

    Gumbel_softmax

    -

    nnf_hardshrink()

    -

    Hardshrink

    -

    nnf_hardsigmoid()

    -

    Hardsigmoid

    -

    nnf_hardswish()

    -

    Hardswish

    -

    nnf_hardtanh() nnf_hardtanh_()

    -

    Hardtanh

    -

    nnf_hinge_embedding_loss()

    -

    Hinge_embedding_loss

    -

    nnf_instance_norm()

    -

    Instance_norm

    -

    nnf_interpolate()

    -

    Interpolate

    -

    nnf_kl_div()

    -

    Kl_div

    -

    nnf_l1_loss()

    -

    L1_loss

    -

    nnf_layer_norm()

    -

    Layer_norm

    -

    nnf_leaky_relu()

    -

    Leaky_relu

    -

    nnf_linear()

    -

    Linear

    -

    nnf_local_response_norm()

    -

    Local_response_norm

    -

    nnf_log_softmax()

    -

    Log_softmax

    -

    nnf_logsigmoid()

    -

    Logsigmoid

    -

    nnf_lp_pool1d()

    -

    Lp_pool1d

    -

    nnf_lp_pool2d()

    -

    Lp_pool2d

    -

    nnf_margin_ranking_loss()

    -

    Margin_ranking_loss

    -

    nnf_max_pool1d()

    -

    Max_pool1d

    -

    nnf_max_pool2d()

    -

    Max_pool2d

    -

    nnf_max_pool3d()

    -

    Max_pool3d

    -

    nnf_max_unpool1d()

    -

    Max_unpool1d

    -

    nnf_max_unpool2d()

    -

    Max_unpool2d

    -

    nnf_max_unpool3d()

    -

    Max_unpool3d

    -

    nnf_mse_loss()

    -

    Mse_loss

    -

    nnf_multi_head_attention_forward()

    -

    Multi head attention forward

    -

    nnf_multi_margin_loss()

    -

    Multi_margin_loss

    -

    nnf_multilabel_margin_loss()

    -

    Multilabel_margin_loss

    -

    nnf_multilabel_soft_margin_loss()

    -

    Multilabel_soft_margin_loss

    -

    nnf_nll_loss()

    -

    Nll_loss

    -

    nnf_normalize()

    -

    Normalize

    -

    nnf_one_hot()

    -

    One_hot

    -

    nnf_pad()

    -

    Pad

    -

    nnf_pairwise_distance()

    -

    Pairwise_distance

    -

    nnf_pdist()

    -

    Pdist

    -

    nnf_pixel_shuffle()

    -

    Pixel_shuffle

    -

    nnf_poisson_nll_loss()

    -

    Poisson_nll_loss

    -

    nnf_prelu()

    -

    Prelu

    -

    nnf_relu() nnf_relu_()

    -

    Relu

    -

    nnf_relu6()

    -

    Relu6

    -

    nnf_rrelu() nnf_rrelu_()

    -

    Rrelu

    -

    nnf_selu() nnf_selu_()

    -

    Selu

    -

    nnf_smooth_l1_loss()

    -

    Smooth_l1_loss

    -

    nnf_soft_margin_loss()

    -

    Soft_margin_loss

    -

    nnf_softmax()

    -

    Softmax

    -

    nnf_softmin()

    -

    Softmin

    -

    nnf_softplus()

    -

    Softplus

    -

    nnf_softshrink()

    -

    Softshrink

    -

    nnf_softsign()

    -

    Softsign

    -

    nnf_tanhshrink()

    -

    Tanhshrink

    -

    nnf_threshold() nnf_threshold_()

    -

    Threshold

    -

    nnf_triplet_margin_loss()

    -

    Triplet_margin_loss

    -

    nnf_unfold()

    -

    Unfold

    -

    optim_adam()

    -

    Implements Adam algorithm.

    -

    optim_required()

    -

    Dummy value indicating a required value.

    -

    optim_sgd()

    -

    SGD optimizer

    -

    tensor_dataset()

    -

    Dataset wrapping tensors.

    -

    torch_abs

    -

    Abs

    -

    torch_acos

    -

    Acos

    -

    torch_adaptive_avg_pool1d

    -

    Adaptive_avg_pool1d

    -

    torch_add

    -

    Add

    -

    torch_addbmm

    -

    Addbmm

    -

    torch_addcdiv

    -

    Addcdiv

    -

    torch_addcmul

    -

    Addcmul

    -

    torch_addmm

    -

    Addmm

    -

    torch_addmv

    -

    Addmv

    -

    torch_addr

    -

    Addr

    -

    torch_allclose

    -

    Allclose

    -

    torch_angle

    -

    Angle

    -

    torch_arange

    -

    Arange

    -

    torch_argmax

    -

    Argmax

    -

    torch_argmin

    -

    Argmin

    -

    torch_argsort

    -

    Argsort

    -

    torch_as_strided

    -

    As_strided

    -

    torch_asin

    -

    Asin

    -

    torch_atan

    -

    Atan

    -

    torch_atan2

    -

    Atan2

    -

    torch_avg_pool1d

    -

    Avg_pool1d

    -

    torch_baddbmm

    -

    Baddbmm

    -

    torch_bartlett_window

    -

    Bartlett_window

    -

    torch_bernoulli

    -

    Bernoulli

    -

    torch_bincount

    -

    Bincount

    -

    torch_bitwise_and

    -

    Bitwise_and

    -

    torch_bitwise_not

    -

    Bitwise_not

    -

    torch_bitwise_or

    -

    Bitwise_or

    -

    torch_bitwise_xor

    -

    Bitwise_xor

    -

    torch_blackman_window

    -

    Blackman_window

    -

    torch_bmm

    -

    Bmm

    -

    torch_broadcast_tensors

    -

    Broadcast_tensors

    -

    torch_can_cast

    -

    Can_cast

    -

    torch_cartesian_prod

    -

    Cartesian_prod

    -

    torch_cat

    -

    Cat

    -

    torch_cdist

    -

    Cdist

    -

    torch_ceil

    -

    Ceil

    -

    torch_celu_

    -

    Celu_

    -

    torch_chain_matmul

    -

    Chain_matmul

    -

    torch_cholesky

    -

    Cholesky

    -

    torch_cholesky_inverse

    -

    Cholesky_inverse

    -

    torch_cholesky_solve

    -

    Cholesky_solve

    -

    torch_chunk

    -

    Chunk

    -

    torch_clamp

    -

    Clamp

    -

    torch_combinations

    -

    Combinations

    -

    torch_conj

    -

    Conj

    -

    torch_conv1d

    -

    Conv1d

    -

    torch_conv2d

    -

    Conv2d

    -

    torch_conv3d

    -

    Conv3d

    -

    torch_conv_tbc

    -

    Conv_tbc

    -

    torch_conv_transpose1d

    -

    Conv_transpose1d

    -

    torch_conv_transpose2d

    -

    Conv_transpose2d

    -

    torch_conv_transpose3d

    -

    Conv_transpose3d

    -

    torch_cos

    -

    Cos

    -

    torch_cosh

    -

    Cosh

    -

    torch_cosine_similarity

    -

    Cosine_similarity

    -

    torch_cross

    -

    Cross

    -

    torch_cummax

    -

    Cummax

    -

    torch_cummin

    -

    Cummin

    -

    torch_cumprod

    -

    Cumprod

    -

    torch_cumsum

    -

    Cumsum

    -

    torch_det

    -

    Det

    -

    torch_device()

    -

    Create a Device object

    -

    torch_diag

    -

    Diag

    -

    torch_diag_embed

    -

    Diag_embed

    -

    torch_diagflat

    -

    Diagflat

    -

    torch_diagonal

    -

    Diagonal

    -

    torch_digamma

    -

    Digamma

    -

    torch_dist

    -

    Dist

    -

    torch_div

    -

    Div

    -

    torch_dot

    -

    Dot

    -

    torch_float32() torch_float() torch_float64() torch_double() torch_float16() torch_half() torch_uint8() torch_int8() torch_int16() torch_short() torch_int32() torch_int() torch_int64() torch_long() torch_bool() torch_quint8() torch_qint8() torch_qint32()

    -

    Torch data types

    -

    torch_eig

    -

    Eig

    -

    torch_einsum

    -

    Einsum

    -

    torch_empty

    -

    Empty

    -

    torch_empty_like

    -

    Empty_like

    -

    torch_empty_strided

    -

    Empty_strided

    -

    torch_eq

    -

    Eq

    -

    torch_equal

    -

    Equal

    -

    torch_erf

    -

    Erf

    -

    torch_erfc

    -

    Erfc

    -

    torch_erfinv

    -

    Erfinv

    -

    torch_exp

    -

    Exp

    -

    torch_expm1

    -

    Expm1

    -

    torch_eye

    -

    Eye

    -

    torch_fft

    -

    Fft

    -

    torch_flatten

    -

    Flatten

    -

    torch_flip

    -

    Flip

    -

    torch_floor

    -

    Floor

    -

    torch_floor_divide

    -

    Floor_divide

    -

    torch_fmod

    -

    Fmod

    -

    torch_frac

    -

    Frac

    -

    torch_full

    -

    Full

    -

    torch_full_like

    -

    Full_like

    -

    torch_gather

    -

    Gather

    -

    torch_ge

    -

    Ge

    -

    torch_generator()

    -

    Create a Generator object

    -

    torch_geqrf

    -

    Geqrf

    -

    torch_ger

    -

    Ger

    -

    torch_gt

    -

    Gt

    -

    torch_hamming_window

    -

    Hamming_window

    -

    torch_hann_window

    -

    Hann_window

    -

    torch_histc

    -

    Histc

    -

    torch_ifft

    -

    Ifft

    -

    torch_imag

    -

    Imag

    -

    torch_index_select

    -

    Index_select

    -

    torch_inverse

    -

    Inverse

    -

    torch_irfft

    -

    Irfft

    -

    torch_is_complex

    -

    Is_complex

    -

    torch_is_floating_point

    -

    Is_floating_point

    -

    torch_isfinite

    -

    Isfinite

    -

    torch_isinf

    -

    Isinf

    -

    torch_isnan

    -

    Isnan

    -

    torch_kthvalue

    -

    Kthvalue

    -

    torch_strided() torch_sparse_coo()

    -

    Creates the corresponding layout

    -

    torch_le

    -

    Le

    -

    torch_lerp

    -

    Lerp

    -

    torch_lgamma

    -

    Lgamma

    -

    torch_linspace

    -

    Linspace

    -

    torch_load()

    -

    Loads a saved object

    -

    torch_log

    -

    Log

    -

    torch_log10

    -

    Log10

    -

    torch_log1p

    -

    Log1p

    -

    torch_log2

    -

    Log2

    -

    torch_logdet

    -

    Logdet

    -

    torch_logical_and

    -

    Logical_and

    -

    torch_logical_not

    -

    Logical_not

    -

    torch_logical_or

    -

    Logical_or

    -

    torch_logical_xor

    -

    Logical_xor

    -

    torch_logspace

    -

    Logspace

    -

    torch_logsumexp

    -

    Logsumexp

    -

    torch_lstsq

    -

    Lstsq

    -

    torch_lt

    -

    Lt

    -

    torch_lu()

    -

    LU

    -

    torch_lu_solve

    -

    Lu_solve

    -

    torch_masked_select

    -

    Masked_select

    -

    torch_matmul

    -

    Matmul

    -

    torch_matrix_power

    -

    Matrix_power

    -

    torch_matrix_rank

    -

    Matrix_rank

    -

    torch_max

    -

    Max

    -

    torch_mean

    -

    Mean

    -

    torch_median

    -

    Median

    -

    torch_contiguous_format() torch_preserve_format() torch_channels_last_format()

    -

    Memory format

    -

    torch_meshgrid

    -

    Meshgrid

    -

    torch_min

    -

    Min

    -

    torch_mm

    -

    Mm

    -

    torch_mode

    -

    Mode

    -

    torch_mul

    -

    Mul

    -

    torch_multinomial

    -

    Multinomial

    -

    torch_mv

    -

    Mv

    -

    torch_mvlgamma

    -

    Mvlgamma

    -

    torch_narrow

    -

    Narrow

    -

    torch_ne

    -

    Ne

    -

    torch_neg

    -

    Neg

    -

    torch_nonzero

    -

    Nonzero

    -

    torch_norm

    -

    Norm

    -

    torch_normal

    -

    Normal

    -

    torch_ones

    -

    Ones

    -

    torch_ones_like

    -

    Ones_like

    -

    torch_orgqr

    -

    Orgqr

    -

    torch_ormqr

    -

    Ormqr

    -

    torch_pdist

    -

    Pdist

    -

    torch_pinverse

    -

    Pinverse

    -

    torch_pixel_shuffle

    -

    Pixel_shuffle

    -

    torch_poisson

    -

    Poisson

    -

    torch_polygamma

    -

    Polygamma

    -

    torch_pow

    -

    Pow

    -

    torch_prod

    -

    Prod

    -

    torch_promote_types

    -

    Promote_types

    -

    torch_qr

    -

    Qr

    -

    torch_per_channel_affine() torch_per_tensor_affine() torch_per_channel_symmetric() torch_per_tensor_symmetric()

    -

    Creates the corresponding QScheme object

    -

    torch_quantize_per_channel

    -

    Quantize_per_channel

    -

    torch_quantize_per_tensor

    -

    Quantize_per_tensor

    -

    torch_rand

    -

    Rand

    -

    torch_rand_like

    -

    Rand_like

    -

    torch_randint

    -

    Randint

    -

    torch_randint_like

    -

    Randint_like

    -

    torch_randn

    -

    Randn

    -

    torch_randn_like

    -

    Randn_like

    -

    torch_randperm

    -

    Randperm

    -

    torch_range

    -

    Range

    -

    torch_real

    -

    Real

    -

    torch_reciprocal

    -

    Reciprocal

    -

    torch_reduction_sum() torch_reduction_mean() torch_reduction_none()

    -

    Creates the reduction object

    -

    torch_relu_

    -

    Relu_

    -

    torch_remainder

    -

    Remainder

    -

    torch_renorm

    -

    Renorm

    -

    torch_repeat_interleave

    -

    Repeat_interleave

    -

    torch_reshape

    -

    Reshape

    -

    torch_result_type

    -

    Result_type

    -

    torch_rfft

    -

    Rfft

    -

    torch_roll

    -

    Roll

    -

    torch_rot90

    -

    Rot90

    -

    torch_round

    -

    Round

    -

    torch_rrelu_

    -

    Rrelu_

    -

    torch_rsqrt

    -

    Rsqrt

    -

    torch_save()

    -

    Saves an object to a disk file.

    -

    torch_selu_

    -

    Selu_

    -

    torch_sigmoid

    -

    Sigmoid

    -

    torch_sign

    -

    Sign

    -

    torch_sin

    -

    Sin

    -

    torch_sinh

    -

    Sinh

    -

    torch_slogdet

    -

    Slogdet

    -

    torch_solve

    -

    Solve

    -

    torch_sort

    -

    Sort

    -

    torch_sparse_coo_tensor

    -

    Sparse_coo_tensor

    -

    torch_split

    -

    Split

    -

    torch_sqrt

    -

    Sqrt

    -

    torch_square

    -

    Square

    -

    torch_squeeze

    -

    Squeeze

    -

    torch_stack

    -

    Stack

    -

    torch_std

    -

    Std

    -

    torch_std_mean

    -

    Std_mean

    -

    torch_stft

    -

    Stft

    -

    torch_sum

    -

    Sum

    -

    torch_svd

    -

    Svd

    -

    torch_symeig

    -

    Symeig

    -

    torch_t

    -

    T

    -

    torch_take

    -

    Take

    -

    torch_tan

    -

    Tan

    -

    torch_tanh

    -

    Tanh

    -

    torch_tensor()

    -

    Converts R objects to a torch tensor

    -

    torch_tensordot

    -

    Tensordot

    -

    torch_threshold_

    -

    Threshold_

    -

    torch_topk

    -

    Topk

    -

    torch_trace

    -

    Trace

    -

    torch_transpose

    -

    Transpose

    -

    torch_trapz

    -

    Trapz

    -

    torch_triangular_solve

    -

    Triangular_solve

    -

    torch_tril

    -

    Tril

    -

    torch_tril_indices

    -

    Tril_indices

    -

    torch_triu

    -

    Triu

    -

    torch_triu_indices

    -

    Triu_indices

    -

    torch_true_divide

    -

    True_divide

    -

    torch_trunc

    -

    Trunc

    -

    torch_unbind

    -

    Unbind

    -

    torch_unique_consecutive

    -

    Unique_consecutive

    -

    torch_unsqueeze

    -

    Unsqueeze

    -

    torch_var

    -

    Var

    -

    torch_var_mean

    -

    Var_mean

    -

    torch_where

    -

    Where

    -

    torch_zeros

    -

    Zeros

    -

    torch_zeros_like

    -

    Zeros_like

    -

    vision_make_grid()

    -

    A simplified version of torchvision.utils.make_grid.

    -

    with_enable_grad()

    -

    Enable grad

    -

    with_no_grad()

    -

    Temporarily modify gradient recording.

    -
    diff --git a/docs/reference/install_torch.html b/docs/reference/install_torch.html
    deleted file mode 100644
    index 22442f16f80f42832437bf21f26b53d88a3350ef..0000000000000000000000000000000000000000
    --- a/docs/reference/install_torch.html
    +++ /dev/null

    Install Torch — install_torch • torch

    Installs Torch and its dependencies.

    install_torch(
      version = "1.5.0",
      type = install_type(version = version),
      reinstall = FALSE,
      path = install_path(),
      ...
    )

    Arguments

    version: The Torch version to install.

    type: The installation type for Torch. Valid values are "cpu" or the 'CUDA' version.

    reinstall: Re-install Torch even if it's already installed?

    path: Optional path to install or check for an already existing installation.

    ...: other optional arguments (like load, for manual installation).

    - -

    Details

    When using path to install in a specific location, make sure the TORCH_HOME environment variable is set to this same path to reuse this installation. The TORCH_INSTALL environment variable can be set to 0 to prevent auto-installing torch, and TORCH_LOAD can be set to 0 to avoid loading dependencies automatically. These environment variables are meant for advanced use cases and troubleshooting only.
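    For instance, a minimal sketch of that workflow (added for illustration; the path below is hypothetical, not from the original page):

    # set TORCH_HOME so later sessions reuse this installation (illustrative path)
    Sys.setenv(TORCH_HOME = "/opt/torch")
    install_torch(path = Sys.getenv("TORCH_HOME"))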

    diff --git a/docs/reference/is_dataloader.html b/docs/reference/is_dataloader.html
    deleted file mode 100644
    index 4963f41a6779ef28cd7ac246b2e5f90b571a97d2..0000000000000000000000000000000000000000
    --- a/docs/reference/is_dataloader.html
    +++ /dev/null

    Checks if the object is a dataloader — is_dataloader • torch

    Checks if the object is a dataloader

    is_dataloader(x)

    Arguments

    x: object to check
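    Examples

    # A usage sketch added for illustration (tensor_dataset() and dataloader()
    # are documented elsewhere in this reference; the sizes are arbitrary):
    ds <- tensor_dataset(torch_randn(10, 3))
    dl <- dataloader(ds, batch_size = 2)
    is_dataloader(dl) # TRUE
    is_dataloader(ds) # FALSE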

    diff --git a/docs/reference/is_torch_dtype.html b/docs/reference/is_torch_dtype.html
    deleted file mode 100644
    index a25b3fb7acd9979f4983c0518267ec43ee4cbaf6..0000000000000000000000000000000000000000
    --- a/docs/reference/is_torch_dtype.html
    +++ /dev/null

    Check if object is a torch data type — is_torch_dtype • torch

    Check if object is a torch data type

    is_torch_dtype(x)

    Arguments

    x: object to check.
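    Examples

    # Illustrative sketch (added; not from the original page). torch_float32()
    # creates a dtype object, documented elsewhere in this reference:
    is_torch_dtype(torch_float32()) # TRUE
    is_torch_dtype("float32")       # FALSE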

    diff --git a/docs/reference/is_torch_layout.html b/docs/reference/is_torch_layout.html
    deleted file mode 100644
    index f3d02486c008a2dd618d19feda484b5a061aa3de..0000000000000000000000000000000000000000
    --- a/docs/reference/is_torch_layout.html
    +++ /dev/null

    Check if an object is a torch layout. — is_torch_layout • torch

    Check if an object is a torch layout.

    is_torch_layout(x)

    Arguments

    x: object to check
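    Examples

    # Illustrative sketch (added; not from the original page). torch_strided()
    # creates a layout object, documented elsewhere in this reference:
    is_torch_layout(torch_strided()) # TRUE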

    diff --git a/docs/reference/is_torch_memory_format.html b/docs/reference/is_torch_memory_format.html
    deleted file mode 100644
    index 9352ef7d6a86d9f5b4719cd903cd0bee2e59777f..0000000000000000000000000000000000000000
    --- a/docs/reference/is_torch_memory_format.html
    +++ /dev/null

    Check if an object is a memory format — is_torch_memory_format • torch

    Check if an object is a memory format

    is_torch_memory_format(x)

    Arguments

    x: object to check
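    Examples

    # Illustrative sketch (added; not from the original page).
    # torch_contiguous_format() creates a memory format object, documented
    # elsewhere in this reference:
    is_torch_memory_format(torch_contiguous_format()) # TRUE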

    diff --git a/docs/reference/is_torch_qscheme.html b/docs/reference/is_torch_qscheme.html
    deleted file mode 100644
    index 4ab4b994369a3f4252c3cfb0637f9ddd862d88ab..0000000000000000000000000000000000000000
    --- a/docs/reference/is_torch_qscheme.html
    +++ /dev/null

    Checks if an object is a QScheme — is_torch_qscheme • torch

    Checks if an object is a QScheme

    is_torch_qscheme(x)

    Arguments

    x: object to check
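    Examples

    # Illustrative sketch (added; not from the original page).
    # torch_per_tensor_affine() creates a QScheme object, documented elsewhere
    # in this reference:
    is_torch_qscheme(torch_per_tensor_affine()) # TRUE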

    diff --git a/docs/reference/kmnist_dataset.html b/docs/reference/kmnist_dataset.html
    deleted file mode 100644
    index e6723725c65d26a8082f42f8d09355ace511ec7e..0000000000000000000000000000000000000000
    --- a/docs/reference/kmnist_dataset.html
    +++ /dev/null

    Kuzushiji-MNIST — kmnist_dataset • torch
    kmnist_dataset(
      root,
      train = TRUE,
      transform = NULL,
      target_transform = NULL,
      download = FALSE
    )

    Arguments

    root: (string) Root directory of dataset where KMNIST/processed/training.pt and KMNIST/processed/test.pt exist.

    train: (bool, optional) If TRUE, creates dataset from training.pt, otherwise from test.pt.

    transform: (callable, optional) A function/transform that takes in an image and returns a transformed version (e.g., a random crop).

    target_transform: (callable, optional) A function/transform that takes in the target and transforms it.

    download: (bool, optional) If TRUE, downloads the dataset from the internet and puts it in the root directory. If the dataset is already downloaded, it is not downloaded again.

    diff --git a/docs/reference/mnist_dataset.html b/docs/reference/mnist_dataset.html
    deleted file mode 100644
    index d9eb3855ed9ba92b3ed1b0afac64f3a91a684fb0..0000000000000000000000000000000000000000
    --- a/docs/reference/mnist_dataset.html
    +++ /dev/null

    MNIST dataset — mnist_dataset • torch

    Prepares the MNIST dataset and optionally downloads it.

    mnist_dataset(
      root,
      train = TRUE,
      transform = NULL,
      target_transform = NULL,
      download = FALSE
    )

    Arguments

    root: (string) Root directory of dataset where MNIST/processed/training.pt and MNIST/processed/test.pt exist.

    train: (bool, optional) If TRUE, creates dataset from training.pt, otherwise from test.pt.

    transform: (callable, optional) A function/transform that takes in an image and returns a transformed version (e.g., a random crop).

    target_transform: (callable, optional) A function/transform that takes in the target and transforms it.

    download: (bool, optional) If TRUE, downloads the dataset from the internet and puts it in the root directory. If the dataset is already downloaded, it is not downloaded again.
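    Examples

    # A sketch added for illustration (downloads data, so not run by default).
    # dataloader(), dataloader_make_iter() and dataloader_next() are documented
    # elsewhere in this reference:
    ds <- mnist_dataset(root = tempdir(), download = TRUE)
    dl <- dataloader(ds, batch_size = 32)
    iter <- dataloader_make_iter(dl)
    batch <- dataloader_next(iter)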

    diff --git a/docs/reference/nn_adaptive_log_softmax_with_loss.html b/docs/reference/nn_adaptive_log_softmax_with_loss.html
    deleted file mode 100644
    index 8bb8c2fa7623e2fa4a943a146eca8b6ed831db35..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_adaptive_log_softmax_with_loss.html
    +++ /dev/null

    AdaptiveLogSoftmaxWithLoss module — nn_adaptive_log_softmax_with_loss • torch
    nn_adaptive_log_softmax_with_loss(
      in_features,
      n_classes,
      cutoffs,
      div_value = 4,
      head_bias = FALSE
    )

    Arguments

    in_features: (int) Number of features in the input tensor

    n_classes: (int) Number of classes in the dataset

    cutoffs: (Sequence) Cutoffs used to assign targets to their buckets

    div_value: (float, optional) value used as an exponent to compute the sizes of the clusters. Default: 4.0

    head_bias: (bool, optional) If TRUE, adds a bias term to the 'head' of the adaptive softmax. Default: FALSE

    - -

    Value

    NamedTuple with output and loss fields:

    • output is a Tensor of size N containing the computed target log probabilities for each example

    • loss is a Scalar representing the computed negative log likelihood loss

    Details

    Adaptive softmax is an approximate strategy for training models with large output spaces. It is most effective when the label distribution is highly imbalanced, for example in natural language modelling, where the word frequency distribution approximately follows Zipf's law.

    Adaptive softmax partitions the labels into several clusters, according to their frequency. These clusters may each contain a different number of targets.

    Additionally, clusters containing less frequent labels assign lower dimensional embeddings to those labels, which speeds up the computation. For each minibatch, only clusters for which at least one target is present are evaluated.

    The idea is that the clusters which are accessed frequently (like the first one, containing the most frequent labels) should also be cheap to compute -- that is, contain a small number of assigned labels. We highly recommend taking a look at the original paper for more details.

    • cutoffs should be an ordered Sequence of integers sorted in increasing order. It controls the number of clusters and the partitioning of targets into clusters. For example, setting cutoffs = c(10, 100, 1000) means that the first 10 targets will be assigned to the 'head' of the adaptive softmax, targets 11, 12, ..., 100 will be assigned to the first cluster, and targets 101, 102, ..., 1000 will be assigned to the second cluster, while targets 1001, 1002, ..., n_classes - 1 will be assigned to the last, third cluster.

    • div_value is used to compute the size of each additional cluster, which is given as \(\left\lfloor\frac{\mbox{in\_features}}{\mbox{div\_value}^{idx}}\right\rfloor\), where \(idx\) is the cluster index (with clusters for less frequent words having larger indices, and indices starting from \(1\)).

    • head_bias if set to TRUE, adds a bias term to the 'head' of the adaptive softmax. See the paper for details. Set to FALSE in the official implementation.
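    For instance, a construction sketch for the cutoffs behaviour described above (added for illustration; the sizes and cutoffs are arbitrary):

    m <- nn_adaptive_log_softmax_with_loss(
      in_features = 64, n_classes = 1000,
      cutoffs = c(10, 100, 500)
    )
    input <- torch_randn(128, 64)
    # integer class labels as a long tensor
    target <- torch_tensor(sample(1000, 128, replace = TRUE), dtype = torch_long())
    res <- m(input, target) # returns the output and loss fields described above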

    Note

    This module returns a NamedTuple with output and loss fields. See further documentation for details.

    To compute log-probabilities for all classes, the log_prob method can be used.

    -

    Warning

    Labels passed as inputs to this module should be sorted according to their frequency. This means that the most frequent label should be represented by the index 0, and the least frequent label should be represented by the index n_classes - 1.

    -

    Shape

    • input: \((N, \mbox{in\_features})\)

    • target: \((N)\) where each value satisfies \(0 <= \mbox{target[i]} <= \mbox{n\_classes}\)

    • output1: \((N)\)

    • output2: Scalar
    diff --git a/docs/reference/nn_batch_norm1d.html b/docs/reference/nn_batch_norm1d.html
    deleted file mode 100644
    index ca25d62f31ce9a0a147cb41fab2e2a0a31cfd24f..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_batch_norm1d.html
    +++ /dev/null

    BatchNorm1D module — nn_batch_norm1d • torch

    Applies Batch Normalization over a 2D or 3D input (a mini-batch of 1D inputs with optional additional channel dimension) as described in the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift.

    nn_batch_norm1d(
      num_features,
      eps = 1e-05,
      momentum = 0.1,
      affine = TRUE,
      track_running_stats = TRUE
    )

    Arguments

    num_features: \(C\) from an expected input of size \((N, C, L)\) or \(L\) from input of size \((N, L)\)

    eps: a value added to the denominator for numerical stability. Default: 1e-5

    momentum: the value used for the running_mean and running_var computation. Can be set to NULL for cumulative moving average (i.e. simple average). Default: 0.1

    affine: a boolean value that, when set to TRUE, gives this module learnable affine parameters. Default: TRUE

    track_running_stats: a boolean value that, when set to TRUE, makes this module track the running mean and variance; when set to FALSE, the module does not track such statistics and always uses batch statistics in both training and eval modes. Default: TRUE

    - -

    Details

    $$
    y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta
    $$

    The mean and standard-deviation are calculated per-dimension over the mini-batches, and \(\gamma\) and \(\beta\) are learnable parameter vectors of size C (where C is the input size). By default, the elements of \(\gamma\) are set to 1 and the elements of \(\beta\) are set to 0.

    Also by default, during training this layer keeps running estimates of its computed mean and variance, which are then used for normalization during evaluation. The running estimates are kept with a default momentum of 0.1. If track_running_stats is set to FALSE, this layer then does not keep running estimates, and batch statistics are instead used during evaluation time as well.

    -

    Note

    This momentum argument is different from the one used in optimizer classes and from the conventional notion of momentum. Mathematically, the update rule for running statistics here is \(\hat{x}_{\mbox{new}} = (1 - \mbox{momentum}) \times \hat{x} + \mbox{momentum} \times x_t\), where \(\hat{x}\) is the estimated statistic and \(x_t\) is the new observed value.

    Because the Batch Normalization is done over the C dimension, computing statistics on (N, L) slices, it is common terminology to call this Temporal Batch Normalization.

    -

    Shape

    • Input: \((N, C)\) or \((N, C, L)\)

    • Output: \((N, C)\) or \((N, C, L)\) (same shape as input)

    Examples

    -
    # With Learnable Parameters
    m <- nn_batch_norm1d(100)
    # Without Learnable Parameters
    m <- nn_batch_norm1d(100, affine = FALSE)
    input <- torch_randn(20, 100)
    output <- m(input)
    -
    diff --git a/docs/reference/nn_batch_norm2d.html b/docs/reference/nn_batch_norm2d.html
    deleted file mode 100644
    index 34a8884a54e0e772c09f2b2e085e50bdbd9c7ec5..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_batch_norm2d.html
    +++ /dev/null

    BatchNorm2D — nn_batch_norm2d • torch

    Applies Batch Normalization over a 4D input (a mini-batch of 2D inputs with an additional channel dimension) as described in the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift.

    nn_batch_norm2d(
      num_features,
      eps = 1e-05,
      momentum = 0.1,
      affine = TRUE,
      track_running_stats = TRUE
    )

    Arguments

    num_features: \(C\) from an expected input of size \((N, C, H, W)\)

    eps: a value added to the denominator for numerical stability. Default: 1e-5

    momentum: the value used for the running_mean and running_var computation. Can be set to NULL for cumulative moving average (i.e. simple average). Default: 0.1

    affine: a boolean value that, when set to TRUE, gives this module learnable affine parameters. Default: TRUE

    track_running_stats: a boolean value that, when set to TRUE, makes this module track the running mean and variance; when set to FALSE, the module does not track such statistics and uses batch statistics instead in both training and eval modes if the running mean and variance are NULL. Default: TRUE

    - -

    Details

    $$
    y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta
    $$

    The mean and standard-deviation are calculated per-dimension over the mini-batches, and \(\gamma\) and \(\beta\) are learnable parameter vectors of size C (where C is the input size). By default, the elements of \(\gamma\) are set to 1 and the elements of \(\beta\) are set to 0. The standard-deviation is calculated via the biased estimator, equivalent to torch_var(input, unbiased = FALSE). Also by default, during training this layer keeps running estimates of its computed mean and variance, which are then used for normalization during evaluation. The running estimates are kept with a default momentum of 0.1.

    If track_running_stats is set to FALSE, this layer then does not keep running estimates, and batch statistics are instead used during evaluation time as well.

    -

    Note

    This momentum argument is different from the one used in optimizer classes and from the conventional notion of momentum. Mathematically, the update rule for running statistics here is \(\hat{x}_{\mbox{new}} = (1 - \mbox{momentum}) \times \hat{x} + \mbox{momentum} \times x_t\), where \(\hat{x}\) is the estimated statistic and \(x_t\) is the new observed value.

    Because the Batch Normalization is done over the C dimension, computing statistics on (N, H, W) slices, it is common terminology to call this Spatial Batch Normalization.

    -

    Shape

    • Input: \((N, C, H, W)\)

    • Output: \((N, C, H, W)\) (same shape as input)

    Examples

    -
    # With Learnable Parameters
    m <- nn_batch_norm2d(100)
    # Without Learnable Parameters
    m <- nn_batch_norm2d(100, affine = FALSE)
    input <- torch_randn(20, 100, 35, 45)
    output <- m(input)
    -
    diff --git a/docs/reference/nn_bce_loss.html b/docs/reference/nn_bce_loss.html
    deleted file mode 100644
    index 9f6987f694ebba6780eaa85888154abfc1177502..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_bce_loss.html
    +++ /dev/null

    Binary cross entropy loss — nn_bce_loss • torch

    Creates a criterion that measures the Binary Cross Entropy between the target and the output:

    nn_bce_loss(weight = NULL, reduction = "mean")

    Arguments

    weight: (Tensor, optional) a manual rescaling weight given to the loss of each batch element. If given, it has to be a Tensor of size nbatch.

    reduction: (string, optional) Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the sum of the output will be divided by the number of elements in the output; 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction. Default: 'mean'

    - -

    Details

    The unreduced (i.e. with reduction set to 'none') loss can be described as:

    $$
    \ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \quad
    l_n = -w_n \left[ y_n \cdot \log x_n + (1 - y_n) \cdot \log (1 - x_n) \right]
    $$

    where \(N\) is the batch size. If reduction is not 'none' (default 'mean'), then

    $$
    \ell(x, y) = \left\{ \begin{array}{ll}
    \mbox{mean}(L), & \mbox{if reduction} = \mbox{'mean';}\\
    \mbox{sum}(L), & \mbox{if reduction} = \mbox{'sum'.}
    \end{array} \right.
    $$

    This is used for measuring the error of a reconstruction in, for example, an auto-encoder. Note that the targets \(y\) should be numbers between 0 and 1.

    Notice that if \(x_n\) is either 0 or 1, one of the log terms would be mathematically undefined in the above loss equation. PyTorch chooses to set \(\log(0) = -\infty\), since \(\lim_{x\to 0} \log(x) = -\infty\).

    However, an infinite term in the loss equation is not desirable for several reasons. For one, if either \(y_n = 0\) or \((1 - y_n) = 0\), then we would be multiplying 0 with infinity. Secondly, if we have an infinite loss value, then we would also have an infinite term in our gradient, since \(\lim_{x\to 0} \frac{d}{dx} \log(x) = \infty\).

    This would make BCELoss's backward method nonlinear with respect to \(x_n\), and using it for things like linear regression would not be straightforward. Our solution is that BCELoss clamps its log function outputs to be greater than or equal to -100. This way, we can always have a finite loss value and a linear backward method.

    -

    Shape

    • Input: \((N, *)\) where \(*\) means any number of additional dimensions

    • Target: \((N, *)\), same shape as the input

    • Output: scalar. If reduction is 'none', then \((N, *)\), same shape as input.
    - - -

    Examples

    -
    m <- nn_sigmoid()
    loss <- nn_bce_loss()
    input <- torch_randn(3, requires_grad = TRUE)
    target <- torch_rand(3)
    output <- loss(m(input), target)
    output$backward()
    -
    diff --git a/docs/reference/nn_bilinear.html b/docs/reference/nn_bilinear.html
    deleted file mode 100644
    index e2cd608a09c8c1bac9e7f2b12d619e2a86f59ed3..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_bilinear.html
    +++ /dev/null

    Bilinear module — nn_bilinear • torch

    Applies a bilinear transformation to the incoming data: \(y = x_1^T A x_2 + b\)

    nn_bilinear(in1_features, in2_features, out_features, bias = TRUE)

    Arguments

    in1_features: size of each first input sample

    in2_features: size of each second input sample

    out_features: size of each output sample

    bias: If set to FALSE, the layer will not learn an additive bias. Default: TRUE

    - -

    Shape

    • Input1: \((N, *, H_{in1})\) where \(H_{in1}=\mbox{in1\_features}\) and \(*\) means any number of additional dimensions. All but the last dimension of the inputs should be the same.

    • Input2: \((N, *, H_{in2})\) where \(H_{in2}=\mbox{in2\_features}\).

    • Output: \((N, *, H_{out})\) where \(H_{out}=\mbox{out\_features}\) and all but the last dimension are the same shape as the input.
    - -

    Attributes

    • weight: the learnable weights of the module of shape \((\mbox{out\_features}, \mbox{in1\_features}, \mbox{in2\_features})\). The values are initialized from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\), where \(k = \frac{1}{\mbox{in1\_features}}\)

    • bias: the learnable bias of the module of shape \((\mbox{out\_features})\). If bias is TRUE, the values are initialized from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\), where \(k = \frac{1}{\mbox{in1\_features}}\)
    - - -

    Examples

    -
    m <- nn_bilinear(20, 30, 50)
    input1 <- torch_randn(128, 20)
    input2 <- torch_randn(128, 30)
    output <- m(input1, input2)
    print(output$size())
    #> [1] 128 50
    diff --git a/docs/reference/nn_celu.html b/docs/reference/nn_celu.html
    deleted file mode 100644
    index 28f07fdebb647dd876e35fd4628f0d732aefd189..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_celu.html
    +++ /dev/null

    CELU module — nn_celu • torch

    Applies the element-wise function:

    nn_celu(alpha = 1, inplace = FALSE)

    Arguments

    alpha: the \(\alpha\) value for the CELU formulation. Default: 1.0

    inplace: can optionally do the operation in-place. Default: FALSE

    - -

    Details

    $$
    \mbox{CELU}(x) = \max(0, x) + \min(0, \alpha * (\exp(x/\alpha) - 1))
    $$

    More details can be found in the paper Continuously Differentiable Exponential Linear Units.

    -

    Shape

    • Input: \((N, *)\) where \(*\) means any number of additional dimensions

    • Output: \((N, *)\), same shape as the input

    Examples

    -
    m <- nn_celu()
    input <- torch_randn(2)
    output <- m(input)
    diff --git a/docs/reference/nn_conv1d.html b/docs/reference/nn_conv1d.html
    deleted file mode 100644
    index 5c37fbd8a65c20c4c065ce2b194ae977e6f25d6f..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_conv1d.html
    +++ /dev/null

    Conv1D module — nn_conv1d • torch
    -

    Applies a 1D convolution over an input signal composed of several input planes. In the simplest case, the output value of the layer with input size \((N, C_{\mbox{in}}, L)\) and output \((N, C_{\mbox{out}}, L_{\mbox{out}})\) can be precisely described as:

    nn_conv1d(
      in_channels,
      out_channels,
      kernel_size,
      stride = 1,
      padding = 0,
      dilation = 1,
      groups = 1,
      bias = TRUE,
      padding_mode = "zeros"
    )

    Arguments

    in_channels: (int) Number of channels in the input image

    out_channels: (int) Number of channels produced by the convolution

    kernel_size: (int or tuple) Size of the convolving kernel

    stride: (int or tuple, optional) Stride of the convolution. Default: 1

    padding: (int or tuple, optional) Zero-padding added to both sides of the input. Default: 0

    dilation: (int or tuple, optional) Spacing between kernel elements. Default: 1

    groups: (int, optional) Number of blocked connections from input channels to output channels. Default: 1

    bias: (bool, optional) If TRUE, adds a learnable bias to the output. Default: TRUE

    padding_mode: (string, optional) 'zeros', 'reflect', 'replicate' or 'circular'. Default: 'zeros'

    - -

    Details

    $$
    \mbox{out}(N_i, C_{\mbox{out}_j}) = \mbox{bias}(C_{\mbox{out}_j}) + \sum_{k = 0}^{C_{in} - 1} \mbox{weight}(C_{\mbox{out}_j}, k) \star \mbox{input}(N_i, k)
    $$

    where \(\star\) is the valid cross-correlation operator, \(N\) is a batch size, \(C\) denotes a number of channels, and \(L\) is a length of signal sequence.

    • stride controls the stride for the cross-correlation, a single number or a one-element tuple.

    • padding controls the amount of implicit zero-paddings on both sides for padding number of points.

    • dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but this link has a nice visualization of what dilation does.

    • groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example,

      • At groups=1, all inputs are convolved to all outputs.

      • At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels and producing half the output channels, and both subsequently concatenated.

      • At groups=in_channels, each input channel is convolved with its own set of filters, of size \(\left\lfloor\frac{out\_channels}{in\_channels}\right\rfloor\).
    - -

    Note

    Depending on the size of your kernel, several (of the last) columns of the input might be lost, because it is a valid cross-correlation, and not a full cross-correlation. It is up to the user to add proper padding.

    When groups == in_channels and out_channels == K * in_channels, where K is a positive integer, this operation is also termed in the literature as a depthwise convolution. In other words, for an input of size \((N, C_{in}, L_{in})\), a depthwise convolution with a depthwise multiplier K can be constructed by the arguments \((C_{\mbox{in}}=C_{in}, C_{\mbox{out}}=C_{in} \times K, ..., \mbox{groups}=C_{in})\).

    -

    Shape

    • Input: \((N, C_{in}, L_{in})\)

    • Output: \((N, C_{out}, L_{out})\) where

    $$
    L_{out} = \left\lfloor\frac{L_{in} + 2 \times \mbox{padding} - \mbox{dilation} \times (\mbox{kernel\_size} - 1) - 1}{\mbox{stride}} + 1\right\rfloor
    $$

    -

    Attributes

    • weight (Tensor): the learnable weights of the module of shape \((\mbox{out\_channels}, \frac{\mbox{in\_channels}}{\mbox{groups}}, \mbox{kernel\_size})\). The values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{in}} * \mbox{kernel\_size}}\)

    • bias (Tensor): the learnable bias of the module of shape (out_channels). If bias is TRUE, then the values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{in}} * \mbox{kernel\_size}}\)

    Examples

    -
    m <- nn_conv1d(16, 33, 3, stride = 2)
    input <- torch_randn(20, 16, 50)
    output <- m(input)
    diff --git a/docs/reference/nn_conv2d.html b/docs/reference/nn_conv2d.html
    deleted file mode 100644
    index 3c2a0ce4b442735c89840f929248325546cb305f..0000000000000000000000000000000000000000
    --- a/docs/reference/nn_conv2d.html
    +++ /dev/null

    Conv2D module — nn_conv2d • torch
    -

    Applies a 2D convolution over an input signal composed of several input planes.

    nn_conv2d(
      in_channels,
      out_channels,
      kernel_size,
      stride = 1,
      padding = 0,
      dilation = 1,
      groups = 1,
      bias = TRUE,
      padding_mode = "zeros"
    )

    Arguments

    in_channels: (int) Number of channels in the input image

    out_channels: (int) Number of channels produced by the convolution

    kernel_size: (int or tuple) Size of the convolving kernel

    stride: (int or tuple, optional) Stride of the convolution. Default: 1

    padding: (int or tuple, optional) Zero-padding added to both sides of the input. Default: 0

    dilation: (int or tuple, optional) Spacing between kernel elements. Default: 1

    groups: (int, optional) Number of blocked connections from input channels to output channels. Default: 1

    bias: (bool, optional) If TRUE, adds a learnable bias to the output. Default: TRUE

    padding_mode: (string, optional) 'zeros', 'reflect', 'replicate' or 'circular'. Default: 'zeros'

    - -

    Details

    In the simplest case, the output value of the layer with input size \((N, C_{\mbox{in}}, H, W)\) and output \((N, C_{\mbox{out}}, H_{\mbox{out}}, W_{\mbox{out}})\) can be precisely described as:

    $$
    \mbox{out}(N_i, C_{\mbox{out}_j}) = \mbox{bias}(C_{\mbox{out}_j}) + \sum_{k = 0}^{C_{\mbox{in}} - 1} \mbox{weight}(C_{\mbox{out}_j}, k) \star \mbox{input}(N_i, k)
    $$

    where \(\star\) is the valid 2D cross-correlation operator, \(N\) is a batch size, \(C\) denotes a number of channels, \(H\) is a height of input planes in pixels, and \(W\) is width in pixels.

    • stride controls the stride for the cross-correlation, a single number or a tuple.

    • padding controls the amount of implicit zero-paddings on both sides for padding number of points for each dimension.

    • dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but this link has a nice visualization of what dilation does.

    • groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example,

      • At groups=1, all inputs are convolved to all outputs.

      • At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels and producing half the output channels, and both subsequently concatenated.

      • At groups=in_channels, each input channel is convolved with its own set of filters, of size \(\left\lfloor\frac{out\_channels}{in\_channels}\right\rfloor\).

    The parameters kernel_size, stride, padding, and dilation can either be:

    • a single int -- in which case the same value is used for the height and width dimension

    • a tuple of two ints -- in which case, the first int is used for the height dimension, and the second int for the width dimension

    Note

    - - - - -

    Depending of the size of your kernel, several (of the last) -columns of the input might be lost, because it is a valid cross-correlation, -and not a full cross-correlation. -It is up to the user to add proper padding.

    -

    When groups == in_channels and out_channels == K * in_channels, -where K is a positive integer, this operation is also termed in -literature as depthwise convolution. -In other words, for an input of size :math:(N, C_{in}, H_{in}, W_{in}), -a depthwise convolution with a depthwise multiplier K, can be constructed by arguments -\((in\_channels=C_{in}, out\_channels=C_{in} \times K, ..., groups=C_{in})\).

In some circumstances when using the CUDA backend with CuDNN, this operator may select a nondeterministic algorithm to increase performance. If this is undesirable, you can try to make the operation deterministic (potentially at a performance cost) by setting backends_cudnn_deterministic = TRUE.

Shape

• Input: \((N, C_{in}, H_{in}, W_{in})\)
• Output: \((N, C_{out}, H_{out}, W_{out})\) where
$$H_{out} = \left\lfloor\frac{H_{in} + 2 \times \mbox{padding}[0] - \mbox{dilation}[0] \times (\mbox{kernel\_size}[0] - 1) - 1}{\mbox{stride}[0]} + 1\right\rfloor$$
$$W_{out} = \left\lfloor\frac{W_{in} + 2 \times \mbox{padding}[1] - \mbox{dilation}[1] \times (\mbox{kernel\_size}[1] - 1) - 1}{\mbox{stride}[1]} + 1\right\rfloor$$
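As a quick check of the formula above, a small helper (hypothetical, not part of the package) that computes one spatial dimension of the output:

# \dontrun{
conv_output_size <- function(n, padding, dilation, kernel_size, stride) {
  floor((n + 2 * padding - dilation * (kernel_size - 1) - 1) / stride + 1)
}
# height for input H = 50 with kernel 3, stride 2, padding 4, dilation 1
conv_output_size(50, padding = 4, dilation = 1, kernel_size = 3, stride = 2)
#> [1] 28
# }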

Attributes

• weight (Tensor): the learnable weights of the module, of shape \((\mbox{out\_channels}, \frac{\mbox{in\_channels}}{\mbox{groups}}, \mbox{kernel\_size[0]}, \mbox{kernel\_size[1]})\). The values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{in}} * \prod_{i=0}^{1}\mbox{kernel\_size}[i]}\)
• bias (Tensor): the learnable bias of the module, of shape (out_channels). If bias is TRUE, then the values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{in}} * \prod_{i=0}^{1}\mbox{kernel\_size}[i]}\)

Examples

# \dontrun{
# With square kernels and equal stride
m <- nn_conv2d(16, 33, 3, stride = 2)
# non-square kernels and unequal stride and with padding
m <- nn_conv2d(16, 33, c(3, 5), stride = c(2, 1), padding = c(4, 2))
# non-square kernels and unequal stride and with padding and dilation
m <- nn_conv2d(16, 33, c(3, 5), stride = c(2, 1), padding = c(4, 2), dilation = c(3, 1))
input <- torch_randn(20, 16, 50, 100)
output <- m(input)
# }
diff --git a/docs/reference/nn_conv3d.html b/docs/reference/nn_conv3d.html
deleted file mode 100644
index b31fe1dc99d6c032222f0600390678dca3bf1387..0000000000000000000000000000000000000000
--- a/docs/reference/nn_conv3d.html
+++ /dev/null
@@ -1,349 +0,0 @@
Conv3D module — nn_conv3d • torch
Applies a 3D convolution over an input signal composed of several input planes. In the simplest case, the output value of the layer with input size \((N, C_{in}, D, H, W)\) and output \((N, C_{out}, D_{out}, H_{out}, W_{out})\) can be precisely described as:

nn_conv3d(
  in_channels,
  out_channels,
  kernel_size,
  stride = 1,
  padding = 0,
  dilation = 1,
  groups = 1,
  bias = TRUE,
  padding_mode = "zeros"
)

Arguments

in_channels

(int): Number of channels in the input image

out_channels

(int): Number of channels produced by the convolution

kernel_size

(int or tuple): Size of the convolving kernel

stride

(int or tuple, optional): Stride of the convolution. Default: 1

padding

(int or tuple, optional): Zero-padding added to all three sides of the input. Default: 0

dilation

(int or tuple, optional): Spacing between kernel elements. Default: 1

groups

(int, optional): Number of blocked connections from input channels to output channels. Default: 1

bias

(bool, optional): If TRUE, adds a learnable bias to the output. Default: TRUE

padding_mode

(string, optional): 'zeros', 'reflect', 'replicate' or 'circular'. Default: 'zeros'

Details

$$out(N_i, C_{out_j}) = bias(C_{out_j}) + \sum_{k = 0}^{C_{in} - 1} weight(C_{out_j}, k) \star input(N_i, k)$$

where \(\star\) is the valid 3D cross-correlation operator.

• stride controls the stride for the cross-correlation.
• padding controls the amount of implicit zero-padding on both sides, padding number of points for each dimension.
• dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but visualizations of dilated convolutions give a good intuition of what dilation does.
• groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example:
  • At groups=1, all inputs are convolved to all outputs.
  • At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels and producing half the output channels, with both subsequently concatenated.
  • At groups=in_channels, each input channel is convolved with its own set of filters, of size \(\left\lfloor\frac{out\_channels}{in\_channels}\right\rfloor\).

The parameters kernel_size, stride, padding, dilation can either be:

• a single int -- in which case the same value is used for the depth, height and width dimensions
• a tuple of three ints -- in which case, the first int is used for the depth dimension, the second int for the height dimension and the third int for the width dimension

Note

Depending on the size of your kernel, several (of the last) columns of the input might be lost, because it is a valid cross-correlation, and not a full cross-correlation. It is up to the user to add proper padding.

When groups == in_channels and out_channels == K * in_channels, where K is a positive integer, this operation is also termed in the literature as a depthwise convolution. In other words, for an input of size \((N, C_{in}, D_{in}, H_{in}, W_{in})\), a depthwise convolution with a depthwise multiplier K can be constructed with arguments \((in\_channels=C_{in}, out\_channels=C_{in} \times K, ..., groups=C_{in})\).

In some circumstances when using the CUDA backend with CuDNN, this operator may select a nondeterministic algorithm to increase performance. If this is undesirable, you can try to make the operation deterministic (potentially at a performance cost) by setting backends_cudnn_deterministic = TRUE.

Shape

• Input: \((N, C_{in}, D_{in}, H_{in}, W_{in})\)
• Output: \((N, C_{out}, D_{out}, H_{out}, W_{out})\) where
$$D_{out} = \left\lfloor\frac{D_{in} + 2 \times \mbox{padding}[0] - \mbox{dilation}[0] \times (\mbox{kernel\_size}[0] - 1) - 1}{\mbox{stride}[0]} + 1\right\rfloor$$
$$H_{out} = \left\lfloor\frac{H_{in} + 2 \times \mbox{padding}[1] - \mbox{dilation}[1] \times (\mbox{kernel\_size}[1] - 1) - 1}{\mbox{stride}[1]} + 1\right\rfloor$$
$$W_{out} = \left\lfloor\frac{W_{in} + 2 \times \mbox{padding}[2] - \mbox{dilation}[2] \times (\mbox{kernel\_size}[2] - 1) - 1}{\mbox{stride}[2]} + 1\right\rfloor$$

Attributes

• weight (Tensor): the learnable weights of the module, of shape \((\mbox{out\_channels}, \frac{\mbox{in\_channels}}{\mbox{groups}}, \mbox{kernel\_size[0]}, \mbox{kernel\_size[1]}, \mbox{kernel\_size[2]})\). The values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{in}} * \prod_{i=0}^{2}\mbox{kernel\_size}[i]}\)
• bias (Tensor): the learnable bias of the module, of shape (out_channels). If bias is TRUE, then the values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{in}} * \prod_{i=0}^{2}\mbox{kernel\_size}[i]}\)

Examples

# \dontrun{
# With square kernels and equal stride
m <- nn_conv3d(16, 33, 3, stride = 2)
# non-square kernels and unequal stride and with padding
m <- nn_conv3d(16, 33, c(3, 5, 2), stride = c(2, 1, 1), padding = c(4, 2, 0))
input <- torch_randn(20, 16, 10, 50, 100)
output <- m(input)
# }
diff --git a/docs/reference/nn_conv_transpose1d.html b/docs/reference/nn_conv_transpose1d.html
deleted file mode 100644
index f499ebd0be95420d53b56f8afc5022658d6c81f4..0000000000000000000000000000000000000000
--- a/docs/reference/nn_conv_transpose1d.html
+++ /dev/null
@@ -1,342 +0,0 @@
ConvTranspose1D — nn_conv_transpose1d • torch

Applies a 1D transposed convolution operator over an input image composed of several input planes.

nn_conv_transpose1d(
  in_channels,
  out_channels,
  kernel_size,
  stride = 1,
  padding = 0,
  output_padding = 0,
  groups = 1,
  bias = TRUE,
  dilation = 1,
  padding_mode = "zeros"
)

Arguments

in_channels

(int): Number of channels in the input image

out_channels

(int): Number of channels produced by the convolution

kernel_size

(int or tuple): Size of the convolving kernel

stride

(int or tuple, optional): Stride of the convolution. Default: 1

padding

(int or tuple, optional): dilation * (kernel_size - 1) - padding zero-padding will be added to both sides of the input. Default: 0

output_padding

(int or tuple, optional): Additional size added to one side of the output shape. Default: 0

groups

(int, optional): Number of blocked connections from input channels to output channels. Default: 1

bias

(bool, optional): If TRUE, adds a learnable bias to the output. Default: TRUE

dilation

(int or tuple, optional): Spacing between kernel elements. Default: 1

padding_mode

(string, optional): 'zeros', 'reflect', 'replicate' or 'circular'. Default: 'zeros'

Details

This module can be seen as the gradient of Conv1d with respect to its input. It is also known as a fractionally-strided convolution or a deconvolution (although it is not an actual deconvolution operation).

• stride controls the stride for the cross-correlation.
• padding controls the amount of implicit zero-padding on both sides, dilation * (kernel_size - 1) - padding number of points. See the note below for details.
• output_padding controls the additional size added to one side of the output shape. See the note below for details.
• dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but visualizations of dilated convolutions give a good intuition of what dilation does.
• groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example:
  • At groups=1, all inputs are convolved to all outputs.
  • At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels and producing half the output channels, with both subsequently concatenated.
  • At groups=in_channels, each input channel is convolved with its own set of filters (of size \(\left\lfloor\frac{out\_channels}{in\_channels}\right\rfloor\)).

Note

Depending on the size of your kernel, several (of the last) columns of the input might be lost, because it is a valid cross-correlation, and not a full cross-correlation. It is up to the user to add proper padding.

The padding argument effectively adds dilation * (kernel_size - 1) - padding amount of zero padding to both sides of the input. This is set so that when a nn_conv1d and a nn_conv_transpose1d are initialized with the same parameters, they are inverses of each other in regard to the input and output shapes. However, when stride > 1, nn_conv1d maps multiple input shapes to the same output shape. output_padding is provided to resolve this ambiguity by effectively increasing the calculated output shape on one side. Note that output_padding is only used to find the output shape, but does not actually add zero-padding to the output.
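To make the shape ambiguity concrete, a sketch (sizes are illustrative): a strided nn_conv1d maps length 12 down to 6, and output_padding = 1 lets the transposed module map 6 back to exactly 12:

# \dontrun{
down <- nn_conv1d(16, 16, kernel_size = 3, stride = 2, padding = 1)
up <- nn_conv_transpose1d(16, 16, kernel_size = 3, stride = 2, padding = 1,
                          output_padding = 1)
x <- torch_randn(1, 16, 12)
y <- down(x) # length 6
z <- up(y)   # length 12 again
# }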

In some circumstances when using the CUDA backend with CuDNN, this operator may select a nondeterministic algorithm to increase performance. If this is undesirable, you can try to make the operation deterministic (potentially at a performance cost) by setting backends_cudnn_deterministic = TRUE.

Shape

• Input: \((N, C_{in}, L_{in})\)
• Output: \((N, C_{out}, L_{out})\) where
$$L_{out} = (L_{in} - 1) \times \mbox{stride} - 2 \times \mbox{padding} + \mbox{dilation} \times (\mbox{kernel\_size} - 1) + \mbox{output\_padding} + 1$$

Attributes

• weight (Tensor): the learnable weights of the module, of shape \((\mbox{in\_channels}, \frac{\mbox{out\_channels}}{\mbox{groups}}, \mbox{kernel\_size})\). The values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{out}} * \mbox{kernel\_size}}\)
• bias (Tensor): the learnable bias of the module, of shape (out_channels). If bias is TRUE, then the values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{out}} * \mbox{kernel\_size}}\)

Examples

# \dontrun{
m <- nn_conv_transpose1d(32, 16, 2)
input <- torch_randn(10, 32, 2)
output <- m(input)
# }
diff --git a/docs/reference/nn_conv_transpose2d.html b/docs/reference/nn_conv_transpose2d.html
deleted file mode 100644
index fec88b2fb8a43558ebcfc0c05ec99e7b35cd04b1..0000000000000000000000000000000000000000
--- a/docs/reference/nn_conv_transpose2d.html
+++ /dev/null
@@ -1,361 +0,0 @@
ConvTranspose2D module — nn_conv_transpose2d • torch

Applies a 2D transposed convolution operator over an input image composed of several input planes.

nn_conv_transpose2d(
  in_channels,
  out_channels,
  kernel_size,
  stride = 1,
  padding = 0,
  output_padding = 0,
  groups = 1,
  bias = TRUE,
  dilation = 1,
  padding_mode = "zeros"
)

Arguments

in_channels

(int): Number of channels in the input image

out_channels

(int): Number of channels produced by the convolution

kernel_size

(int or tuple): Size of the convolving kernel

stride

(int or tuple, optional): Stride of the convolution. Default: 1

padding

(int or tuple, optional): dilation * (kernel_size - 1) - padding zero-padding will be added to both sides of each dimension in the input. Default: 0

output_padding

(int or tuple, optional): Additional size added to one side of each dimension in the output shape. Default: 0

groups

(int, optional): Number of blocked connections from input channels to output channels. Default: 1

bias

(bool, optional): If TRUE, adds a learnable bias to the output. Default: TRUE

dilation

(int or tuple, optional): Spacing between kernel elements. Default: 1

padding_mode

(string, optional): 'zeros', 'reflect', 'replicate' or 'circular'. Default: 'zeros'

Details

This module can be seen as the gradient of Conv2d with respect to its input. It is also known as a fractionally-strided convolution or a deconvolution (although it is not an actual deconvolution operation).

• stride controls the stride for the cross-correlation.
• padding controls the amount of implicit zero-padding on both sides, dilation * (kernel_size - 1) - padding number of points. See the note below for details.
• output_padding controls the additional size added to one side of the output shape. See the note below for details.
• dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but visualizations of dilated convolutions give a good intuition of what dilation does.
• groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example:
  • At groups=1, all inputs are convolved to all outputs.
  • At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels and producing half the output channels, with both subsequently concatenated.
  • At groups=in_channels, each input channel is convolved with its own set of filters (of size \(\left\lfloor\frac{out\_channels}{in\_channels}\right\rfloor\)).

The parameters kernel_size, stride, padding, output_padding can either be:

• a single int -- in which case the same value is used for the height and width dimensions
• a tuple of two ints -- in which case, the first int is used for the height dimension, and the second int for the width dimension

Note

Depending on the size of your kernel, several (of the last) columns of the input might be lost, because it is a valid cross-correlation, and not a full cross-correlation. It is up to the user to add proper padding.

The padding argument effectively adds dilation * (kernel_size - 1) - padding amount of zero padding to both sides of the input. This is set so that when a nn_conv2d and a nn_conv_transpose2d are initialized with the same parameters, they are inverses of each other in regard to the input and output shapes. However, when stride > 1, nn_conv2d maps multiple input shapes to the same output shape. output_padding is provided to resolve this ambiguity by effectively increasing the calculated output shape on one side. Note that output_padding is only used to find the output shape, but does not actually add zero-padding to the output.

In some circumstances when using the CUDA backend with CuDNN, this operator may select a nondeterministic algorithm to increase performance. If this is undesirable, you can try to make the operation deterministic (potentially at a performance cost) by setting backends_cudnn_deterministic = TRUE.

Shape

• Input: \((N, C_{in}, H_{in}, W_{in})\)
• Output: \((N, C_{out}, H_{out}, W_{out})\) where
$$H_{out} = (H_{in} - 1) \times \mbox{stride}[0] - 2 \times \mbox{padding}[0] + \mbox{dilation}[0] \times (\mbox{kernel\_size}[0] - 1) + \mbox{output\_padding}[0] + 1$$
$$W_{out} = (W_{in} - 1) \times \mbox{stride}[1] - 2 \times \mbox{padding}[1] + \mbox{dilation}[1] \times (\mbox{kernel\_size}[1] - 1) + \mbox{output\_padding}[1] + 1$$

Attributes

• weight (Tensor): the learnable weights of the module, of shape \((\mbox{in\_channels}, \frac{\mbox{out\_channels}}{\mbox{groups}}, \mbox{kernel\_size[0]}, \mbox{kernel\_size[1]})\). The values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{out}} * \prod_{i=0}^{1}\mbox{kernel\_size}[i]}\)
• bias (Tensor): the learnable bias of the module, of shape (out_channels). If bias is TRUE, then the values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{out}} * \prod_{i=0}^{1}\mbox{kernel\_size}[i]}\)

Examples

# \dontrun{
# With square kernels and equal stride
m <- nn_conv_transpose2d(16, 33, 3, stride = 2)
# non-square kernels and unequal stride and with padding
m <- nn_conv_transpose2d(16, 33, c(3, 5), stride = c(2, 1), padding = c(4, 2))
input <- torch_randn(20, 16, 50, 100)
output <- m(input)
# exact output size can also be specified as an argument
input <- torch_randn(1, 16, 12, 12)
downsample <- nn_conv2d(16, 16, 3, stride = 2, padding = 1)
upsample <- nn_conv_transpose2d(16, 16, 3, stride = 2, padding = 1)
h <- downsample(input)
h$size()
#> [1] 1 16 6 6
output <- upsample(h, output_size = input$size())
output$size()
#> [1] 1 16 12 12
# }
diff --git a/docs/reference/nn_conv_transpose3d.html b/docs/reference/nn_conv_transpose3d.html
deleted file mode 100644
index e929e4949f461b8a1cebd9b157b33c42b4eef7ed..0000000000000000000000000000000000000000
--- a/docs/reference/nn_conv_transpose3d.html
+++ /dev/null
@@ -1,354 +0,0 @@
ConvTranspose3D module — nn_conv_transpose3d • torch

Applies a 3D transposed convolution operator over an input image composed of several input planes.

nn_conv_transpose3d(
  in_channels,
  out_channels,
  kernel_size,
  stride = 1,
  padding = 0,
  output_padding = 0,
  groups = 1,
  bias = TRUE,
  dilation = 1,
  padding_mode = "zeros"
)

Arguments

in_channels

(int): Number of channels in the input image

out_channels

(int): Number of channels produced by the convolution

kernel_size

(int or tuple): Size of the convolving kernel

stride

(int or tuple, optional): Stride of the convolution. Default: 1

padding

(int or tuple, optional): dilation * (kernel_size - 1) - padding zero-padding will be added to both sides of each dimension in the input. Default: 0

output_padding

(int or tuple, optional): Additional size added to one side of each dimension in the output shape. Default: 0

groups

(int, optional): Number of blocked connections from input channels to output channels. Default: 1

bias

(bool, optional): If TRUE, adds a learnable bias to the output. Default: TRUE

dilation

(int or tuple, optional): Spacing between kernel elements. Default: 1

padding_mode

(string, optional): 'zeros', 'reflect', 'replicate' or 'circular'. Default: 'zeros'

Details

The transposed convolution operator multiplies each input value element-wise by a learnable kernel, and sums over the outputs from all input feature planes.

This module can be seen as the gradient of Conv3d with respect to its input. It is also known as a fractionally-strided convolution or a deconvolution (although it is not an actual deconvolution operation).

• stride controls the stride for the cross-correlation.
• padding controls the amount of implicit zero-padding on both sides, dilation * (kernel_size - 1) - padding number of points. See the note below for details.
• output_padding controls the additional size added to one side of the output shape. See the note below for details.
• dilation controls the spacing between the kernel points; also known as the à trous algorithm. It is harder to describe, but visualizations of dilated convolutions give a good intuition of what dilation does.
• groups controls the connections between inputs and outputs. in_channels and out_channels must both be divisible by groups. For example:
  • At groups=1, all inputs are convolved to all outputs.
  • At groups=2, the operation becomes equivalent to having two conv layers side by side, each seeing half the input channels and producing half the output channels, with both subsequently concatenated.
  • At groups=in_channels, each input channel is convolved with its own set of filters (of size \(\left\lfloor\frac{out\_channels}{in\_channels}\right\rfloor\)).

The parameters kernel_size, stride, padding, output_padding can either be:

• a single int -- in which case the same value is used for the depth, height and width dimensions
• a tuple of three ints -- in which case, the first int is used for the depth dimension, the second int for the height dimension and the third int for the width dimension

Note

Depending on the size of your kernel, several (of the last) columns of the input might be lost, because it is a valid cross-correlation, and not a full cross-correlation. It is up to the user to add proper padding.

The padding argument effectively adds dilation * (kernel_size - 1) - padding amount of zero padding to both sides of the input. This is set so that when a nn_conv3d and a nn_conv_transpose3d are initialized with the same parameters, they are inverses of each other in regard to the input and output shapes. However, when stride > 1, nn_conv3d maps multiple input shapes to the same output shape. output_padding is provided to resolve this ambiguity by effectively increasing the calculated output shape on one side. Note that output_padding is only used to find the output shape, but does not actually add zero-padding to the output.

In some circumstances when using the CUDA backend with CuDNN, this operator may select a nondeterministic algorithm to increase performance. If this is undesirable, you can try to make the operation deterministic (potentially at a performance cost) by setting backends_cudnn_deterministic = TRUE.

Shape

• Input: \((N, C_{in}, D_{in}, H_{in}, W_{in})\)
• Output: \((N, C_{out}, D_{out}, H_{out}, W_{out})\) where
$$D_{out} = (D_{in} - 1) \times \mbox{stride}[0] - 2 \times \mbox{padding}[0] + \mbox{dilation}[0] \times (\mbox{kernel\_size}[0] - 1) + \mbox{output\_padding}[0] + 1$$
$$H_{out} = (H_{in} - 1) \times \mbox{stride}[1] - 2 \times \mbox{padding}[1] + \mbox{dilation}[1] \times (\mbox{kernel\_size}[1] - 1) + \mbox{output\_padding}[1] + 1$$
$$W_{out} = (W_{in} - 1) \times \mbox{stride}[2] - 2 \times \mbox{padding}[2] + \mbox{dilation}[2] \times (\mbox{kernel\_size}[2] - 1) + \mbox{output\_padding}[2] + 1$$

Attributes

• weight (Tensor): the learnable weights of the module, of shape \((\mbox{in\_channels}, \frac{\mbox{out\_channels}}{\mbox{groups}}, \mbox{kernel\_size[0]}, \mbox{kernel\_size[1]}, \mbox{kernel\_size[2]})\). The values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{out}} * \prod_{i=0}^{2}\mbox{kernel\_size}[i]}\)
• bias (Tensor): the learnable bias of the module, of shape (out_channels). If bias is TRUE, then the values of these weights are sampled from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{groups}{C_{\mbox{out}} * \prod_{i=0}^{2}\mbox{kernel\_size}[i]}\)

Examples
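A minimal usage sketch (sizes are illustrative), in the style of the other transposed-convolution pages:

# \dontrun{
# With square kernels and equal stride
m <- nn_conv_transpose3d(16, 33, 3, stride = 2)
input <- torch_randn(20, 16, 10, 50, 100)
output <- m(input)
# }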
diff --git a/docs/reference/nn_cross_entropy_loss.html b/docs/reference/nn_cross_entropy_loss.html
deleted file mode 100644
index b986b77ca276a5ea8faedce6f5dd68903a7ddce2..0000000000000000000000000000000000000000
--- a/docs/reference/nn_cross_entropy_loss.html
+++ /dev/null
@@ -1,277 +0,0 @@
CrossEntropyLoss module — nn_cross_entropy_loss • torch

This criterion combines nn_log_softmax() and nn_nll_loss() in one single class. It is useful when training a classification problem with C classes.

nn_cross_entropy_loss(weight = NULL, ignore_index = -100, reduction = "mean")

Arguments

weight

(Tensor, optional): a manual rescaling weight given to each class. If given, has to be a Tensor of size C

ignore_index

(int, optional): Specifies a target value that is ignored and does not contribute to the input gradient. When size_average is TRUE, the loss is averaged over non-ignored targets.

reduction

(string, optional): Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction. Default: 'mean'

Details

If provided, the optional argument weight should be a 1D Tensor assigning weight to each of the classes. This is particularly useful when you have an unbalanced training set.

The input is expected to contain raw, unnormalized scores for each class. input has to be a Tensor of size either \((minibatch, C)\) or \((minibatch, C, d_1, d_2, ..., d_K)\) with \(K \geq 1\) for the K-dimensional case (described later).

This criterion expects a class index in the range \([0, C-1]\) as the target for each value of a 1D tensor of size minibatch; if ignore_index is specified, this criterion also accepts this class index (this index may not necessarily be in the class range).

The loss can be described as:

$$\mbox{loss}(x, class) = -\log\left(\frac{\exp(x[class])}{\sum_j \exp(x[j])}\right) = -x[class] + \log\left(\sum_j \exp(x[j])\right)$$

or in the case of the weight argument being specified:

$$\mbox{loss}(x, class) = weight[class] \left(-x[class] + \log\left(\sum_j \exp(x[j])\right)\right)$$

The losses are averaged across observations for each minibatch. This criterion can also be used for higher-dimension inputs, such as 2D images, by providing an input of size \((minibatch, C, d_1, d_2, ..., d_K)\) with \(K \geq 1\), where \(K\) is the number of dimensions, and a target of appropriate shape (see below).
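The unweighted formula can be checked numerically (a sketch; note that R torch uses 1-based class indices, so the target below refers to the third column of x):

# \dontrun{
loss <- nn_cross_entropy_loss()
x <- torch_randn(1, 5)
target <- torch_tensor(3, dtype = torch_long())
loss(x, target)
# the same value computed by hand: -x[class] + log(sum_j exp(x[j]))
-x[1, 3] + torch_logsumexp(x, dim = 2)
# }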

Shape

• Input: \((N, C)\) where C = number of classes, or \((N, C, d_1, d_2, ..., d_K)\) with \(K \geq 1\) in the case of K-dimensional loss.
• Target: \((N)\) where each value is \(0 \leq \mbox{targets}[i] \leq C-1\), or \((N, d_1, d_2, ..., d_K)\) with \(K \geq 1\) in the case of K-dimensional loss.
• Output: scalar. If reduction is 'none', then the same size as the target: \((N)\), or \((N, d_1, d_2, ..., d_K)\) with \(K \geq 1\) in the case of K-dimensional loss.

Examples

# \dontrun{
loss <- nn_cross_entropy_loss()
input <- torch_randn(3, 5, requires_grad = TRUE)
target <- torch_randint(low = 1, high = 5, size = 3, dtype = torch_long())
output <- loss(input, target)
output$backward()
# }
diff --git a/docs/reference/nn_dropout.html b/docs/reference/nn_dropout.html
deleted file mode 100644
index 95fa808ba10a0d9ae69d3bbb87883eb19566e0fa..0000000000000000000000000000000000000000
--- a/docs/reference/nn_dropout.html
+++ /dev/null
@@ -1,239 +0,0 @@
Dropout module — nn_dropout • torch

During training, randomly zeroes some of the elements of the input tensor with probability p using samples from a Bernoulli distribution. Each channel will be zeroed out independently on every forward call.

nn_dropout(p = 0.5, inplace = FALSE)

Arguments

p

probability of an element to be zeroed. Default: 0.5

inplace

If set to TRUE, will do this operation in-place. Default: FALSE.

Details

This has proven to be an effective technique for regularization and preventing the co-adaptation of neurons, as described in the paper Improving neural networks by preventing co-adaptation of feature detectors.

Furthermore, the outputs are scaled by a factor of \(\frac{1}{1-p}\) during training. This means that during evaluation the module simply computes an identity function.
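Both behaviours can be seen directly (a sketch):

# \dontrun{
m <- nn_dropout(p = 0.5)
x <- torch_ones(10)
m$train() # training mode: ~half the elements zeroed, survivors scaled to 1/(1 - 0.5) = 2
m(x)
m$eval()  # evaluation mode: identity
m(x)
# }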

Shape

• Input: \((*)\). Input can be of any shape
• Output: \((*)\). Output is of the same shape as input

Examples

# \dontrun{
m <- nn_dropout(p = 0.2)
input <- torch_randn(20, 16)
output <- m(input)
# }
diff --git a/docs/reference/nn_dropout2d.html b/docs/reference/nn_dropout2d.html
deleted file mode 100644
index efcc07a6ba6dc0c4f6e37af555bfeaa190f8e20a..0000000000000000000000000000000000000000
--- a/docs/reference/nn_dropout2d.html
+++ /dev/null
@@ -1,243 +0,0 @@
Dropout2D module — nn_dropout2d • torch

Randomly zero out entire channels (a channel is a 2D feature map, e.g., the \(j\)-th channel of the \(i\)-th sample in the batched input is a 2D tensor \(\mbox{input}[i, j]\)).

nn_dropout2d(p = 0.5, inplace = FALSE)

Arguments

p

(float, optional): probability of an element to be zeroed. Default: 0.5

inplace

(bool, optional): If set to TRUE, will do this operation in-place

Details

Each channel will be zeroed out independently on every forward call with probability p using samples from a Bernoulli distribution. Usually the input comes from nn_conv2d modules.

As described in the paper Efficient Object Localization Using Convolutional Networks, if adjacent pixels within feature maps are strongly correlated (as is normally the case in early convolution layers) then i.i.d. dropout will not regularize the activations and will otherwise just result in an effective learning rate decrease. In this case, nn_dropout2d will help promote independence between feature maps and should be used instead.

Shape

• Input: \((N, C, H, W)\)
• Output: \((N, C, H, W)\) (same shape as input)

Examples

# \dontrun{
m <- nn_dropout2d(p = 0.2)
input <- torch_randn(20, 16, 32, 32)
output <- m(input)
# }
diff --git a/docs/reference/nn_dropout3d.html b/docs/reference/nn_dropout3d.html
deleted file mode 100644
index 32d4545dfbb9ab138232d56407cafe9cf86847ee..0000000000000000000000000000000000000000
--- a/docs/reference/nn_dropout3d.html
+++ /dev/null
@@ -1,243 +0,0 @@
Dropout3D module — nn_dropout3d • torch

Randomly zero out entire channels (a channel is a 3D feature map, e.g., the \(j\)-th channel of the \(i\)-th sample in the batched input is a 3D tensor \(\mbox{input}[i, j]\)).

nn_dropout3d(p = 0.5, inplace = FALSE)

Arguments

p

(float, optional): probability of an element to be zeroed. Default: 0.5

inplace

(bool, optional): If set to TRUE, will do this operation in-place

Details

Each channel will be zeroed out independently on every forward call with probability p using samples from a Bernoulli distribution. Usually the input comes from nn_conv3d modules.

As described in the paper Efficient Object Localization Using Convolutional Networks, if adjacent pixels within feature maps are strongly correlated (as is normally the case in early convolution layers) then i.i.d. dropout will not regularize the activations and will otherwise just result in an effective learning rate decrease. In this case, nn_dropout3d will help promote independence between feature maps and should be used instead.

Shape

• Input: \((N, C, D, H, W)\)
• Output: \((N, C, D, H, W)\) (same shape as input)

Examples

# \dontrun{
m <- nn_dropout3d(p = 0.2)
input <- torch_randn(20, 16, 4, 32, 32)
output <- m(input)
# }
diff --git a/docs/reference/nn_elu.html b/docs/reference/nn_elu.html
deleted file mode 100644
index c68ba52e1fee531b05e1ef948df07924b133cf4e..0000000000000000000000000000000000000000
--- a/docs/reference/nn_elu.html
+++ /dev/null
@@ -1,231 +0,0 @@
ELU module — nn_elu • torch

    Applies the element-wise function:

nn_elu(alpha = 1, inplace = FALSE)

Arguments

alpha

the \(\alpha\) value for the ELU formulation. Default: 1.0

inplace

can optionally do the operation in-place. Default: FALSE

Details

$$\mbox{ELU}(x) = \max(0,x) + \min(0, \alpha * (\exp(x) - 1))$$

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

# \dontrun{
m <- nn_elu()
input <- torch_randn(2)
output <- m(input)
# }
diff --git a/docs/reference/nn_embedding.html b/docs/reference/nn_embedding.html
deleted file mode 100644
index a8881ef38c667b9fef6cec4845c7839eb2847878..0000000000000000000000000000000000000000
--- a/docs/reference/nn_embedding.html
+++ /dev/null
@@ -1,311 +0,0 @@
Embedding module — nn_embedding • torch

A simple lookup table that stores embeddings of a fixed dictionary and size. This module is often used to store word embeddings and retrieve them using indices. The input to the module is a list of indices, and the output is the corresponding word embeddings.

nn_embedding(
  num_embeddings,
  embedding_dim,
  padding_idx = NULL,
  max_norm = NULL,
  norm_type = 2,
  scale_grad_by_freq = FALSE,
  sparse = FALSE,
  .weight = NULL
)

Arguments

num_embeddings

(int): size of the dictionary of embeddings

embedding_dim

(int): the size of each embedding vector

padding_idx

(int, optional): If given, pads the output with the embedding vector at padding_idx (initialized to zeros) whenever it encounters the index.

max_norm

(float, optional): If given, each embedding vector with norm larger than max_norm is renormalized to have norm max_norm.

norm_type

(float, optional): The p of the p-norm to compute for the max_norm option. Default: 2.

scale_grad_by_freq

(boolean, optional): If given, this will scale gradients by the inverse of frequency of the words in the mini-batch. Default: FALSE.

sparse

(bool, optional): If TRUE, gradient w.r.t. weight matrix will be a sparse tensor. See the Note below for more details regarding sparse gradients.

.weight

(Tensor): embedding weights (in case you want to set them manually)

Note

Keep in mind that only a limited number of optimizers support sparse gradients: currently it's optim.SGD (CUDA and CPU), optim.SparseAdam (CUDA and CPU) and optim.Adagrad (CPU).

With padding_idx set, the embedding vector at padding_idx is initialized to all zeros. However, note that this vector can be modified afterwards, e.g., using a customized initialization method, and thus changing the vector used to pad the output. The gradient for this vector from nn_embedding is always zero.

Attributes

• weight (Tensor): the learnable weights of the module, of shape (num_embeddings, embedding_dim), initialized from \(\mathcal{N}(0, 1)\)

Shape

• Input: \((*)\), LongTensor of arbitrary shape containing the indices to extract
• Output: \((*, H)\), where * is the input shape and \(H=\mbox{embedding\_dim}\)

Examples

# \dontrun{
# an Embedding module containing 10 tensors of size 3
embedding <- nn_embedding(10, 3)
# a batch of 2 samples of 4 indices each
input <- torch_tensor(rbind(c(1, 2, 4, 5), c(4, 3, 2, 9)), dtype = torch_long())
embedding(input)
#> torch_tensor
#> (1,.,.) =
#>  -0.5531  0.2969 -1.9168
#>  -0.7095 -0.1328 -0.7352
#>  -1.5311 -0.6539  0.7804
#>   1.5343  0.1139  1.1985
#>
#> (2,.,.) =
#>  -1.5311 -0.6539  0.7804
#>  -0.1120  0.9578  0.1195
#>  -0.7095 -0.1328 -0.7352
#>  -0.4247  0.6266 -0.1286
#> [ CPUFloatType{2,4,3} ]
# example with padding_idx
embedding <- nn_embedding(10, 3, padding_idx = 1)
input <- torch_tensor(matrix(c(1, 3, 1, 6), nrow = 1), dtype = torch_long())
embedding(input)
#> torch_tensor
#> (1,.,.) =
#>   0.0000  0.0000  0.0000
#>  -1.2943 -1.0279  0.6483
#>   0.0000  0.0000  0.0000
#>   0.4053  0.7866 -0.3922
#> [ CPUFloatType{1,4,3} ]
# }
diff --git a/docs/reference/nn_gelu.html b/docs/reference/nn_gelu.html
deleted file mode 100644
index 57885d648d9938c46cf70a87fe44bb246c466050..0000000000000000000000000000000000000000
--- a/docs/reference/nn_gelu.html
+++ /dev/null
@@ -1,219 +0,0 @@
GELU module — nn_gelu • torch

Applies the Gaussian Error Linear Units function:

$$\mbox{GELU}(x) = x * \Phi(x)$$

nn_gelu()

Details

where \(\Phi(x)\) is the Cumulative Distribution Function for the Gaussian Distribution.

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

# \dontrun{
m <- nn_gelu()
input <- torch_randn(2)
output <- m(input)
# }
diff --git a/docs/reference/nn_glu.html b/docs/reference/nn_glu.html
deleted file mode 100644
index 117725434b7d5ad41f5f3a1d8e75034744bfd50c..0000000000000000000000000000000000000000
--- a/docs/reference/nn_glu.html
+++ /dev/null
@@ -1,226 +0,0 @@
GLU module — nn_glu • torch

Applies the gated linear unit function \({GLU}(a, b)= a \otimes \sigma(b)\) where \(a\) is the first half of the input matrices and \(b\) is the second half.

nn_glu(dim = -1)

Arguments

dim

(int): the dimension on which to split the input. Default: -1

Shape

• Input: \((\ast_1, N, \ast_2)\) where * means any number of additional dimensions
• Output: \((\ast_1, M, \ast_2)\) where \(M=N/2\)

Examples

# \dontrun{
m <- nn_glu()
input <- torch_randn(4, 2)
output <- m(input)
# }
diff --git a/docs/reference/nn_hardshrink.html b/docs/reference/nn_hardshrink.html
deleted file mode 100644
index 3fc589f2e1caf3fc81d108a6aacadec2cea8ccce..0000000000000000000000000000000000000000
--- a/docs/reference/nn_hardshrink.html
+++ /dev/null
@@ -1,233 +0,0 @@
Hardshrink module — nn_hardshrink • torch

    Applies the hard shrinkage function element-wise:

nn_hardshrink(lambd = 0.5)

Arguments

lambd

the \(\lambda\) value for the Hardshrink formulation. Default: 0.5

Details

$$\mbox{HardShrink}(x) = \left\{ \begin{array}{ll} x, & \mbox{ if } x > \lambda \\ x, & \mbox{ if } x < -\lambda \\ 0, & \mbox{ otherwise } \end{array} \right.$$

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

# \dontrun{
m <- nn_hardshrink()
input <- torch_randn(2)
output <- m(input)
# }
diff --git a/docs/reference/nn_hardsigmoid.html b/docs/reference/nn_hardsigmoid.html
deleted file mode 100644
index 4290cdbcbc1ec25263761882a82ca0f3d6f6fe2a..0000000000000000000000000000000000000000
--- a/docs/reference/nn_hardsigmoid.html
+++ /dev/null
@@ -1,224 +0,0 @@
Hardsigmoid module — nn_hardsigmoid • torch

    Applies the element-wise function:

nn_hardsigmoid()

Details

$$\mbox{Hardsigmoid}(x) = \left\{ \begin{array}{ll} 0 & \mbox{if~} x \le -3, \\ 1 & \mbox{if~} x \ge +3, \\ x / 6 + 1 / 2 & \mbox{otherwise} \end{array} \right.$$

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

# \dontrun{
m <- nn_hardsigmoid()
input <- torch_randn(2)
output <- m(input)
# }
diff --git a/docs/reference/nn_hardswish.html b/docs/reference/nn_hardswish.html
deleted file mode 100644
index aeff39b397a1bb32032efc29c8c2cacbba62cad5..0000000000000000000000000000000000000000
--- a/docs/reference/nn_hardswish.html
+++ /dev/null
@@ -1,220 +0,0 @@
Hardswish module — nn_hardswish • torch

Applies the hardswish function, element-wise, as described in the paper Searching for MobileNetV3.

nn_hardswish()

Details

$$\mbox{Hardswish}(x) = \left\{ \begin{array}{ll} 0 & \mbox{if } x \le -3, \\ x & \mbox{if } x \ge +3, \\ x \cdot (x + 3)/6 & \mbox{otherwise} \end{array} \right.$$

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples
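A minimal usage sketch, mirroring the neighbouring activation pages:

# \dontrun{
m <- nn_hardswish()
input <- torch_randn(2)
output <- m(input)
# }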
diff --git a/docs/reference/nn_hardtanh.html b/docs/reference/nn_hardtanh.html
deleted file mode 100644
index d959dddda31a7847f0251496915232c22d013758..0000000000000000000000000000000000000000
--- a/docs/reference/nn_hardtanh.html
+++ /dev/null
@@ -1,244 +0,0 @@
Hardtanh module — nn_hardtanh • torch

Applies the HardTanh function element-wise. HardTanh is defined as:

nn_hardtanh(min_val = -1, max_val = 1, inplace = FALSE)

Arguments

min_val

minimum value of the linear region range. Default: -1

max_val

maximum value of the linear region range. Default: 1

inplace

can optionally do the operation in-place. Default: FALSE

Details

$$\mbox{HardTanh}(x) = \left\{ \begin{array}{ll} 1 & \mbox{ if } x > 1 \\ -1 & \mbox{ if } x < -1 \\ x & \mbox{ otherwise } \end{array} \right.$$

The range of the linear region \([-1, 1]\) can be adjusted using min_val and max_val.

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

# \dontrun{
m <- nn_hardtanh(-2, 2)
input <- torch_randn(2)
output <- m(input)
# }
diff --git a/docs/reference/nn_identity.html b/docs/reference/nn_identity.html
deleted file mode 100644
index 8ce1f5eb9322d632e68d33baef4f4e1582c09e2d..0000000000000000000000000000000000000000
--- a/docs/reference/nn_identity.html
+++ /dev/null
@@ -1,213 +0,0 @@
Identity module — nn_identity • torch

    A placeholder identity operator that is argument-insensitive.

nn_identity(...)

Arguments

...

any arguments (unused)

Examples

# \dontrun{
m <- nn_identity(54, unused_argument1 = 0.1, unused_argument2 = FALSE)
input <- torch_randn(128, 20)
output <- m(input)
print(output$size())
#> [1] 128 20
# }
diff --git a/docs/reference/nn_init_calculate_gain.html b/docs/reference/nn_init_calculate_gain.html
deleted file mode 100644
index 33069b6fb07e6d14e18a20aba16a74966eda9603..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_calculate_gain.html
+++ /dev/null
@@ -1,209 +0,0 @@
Calculate gain — nn_init_calculate_gain • torch

    Return the recommended gain value for the given nonlinearity function.

nn_init_calculate_gain(nonlinearity, param = NULL)

Arguments

nonlinearity

the non-linear function

param

optional parameter for the non-linear function
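For instance (a sketch; the returned values follow the usual gain conventions, e.g. \(\sqrt{2}\) for 'relu'):

# \dontrun{
nn_init_calculate_gain("relu")            # sqrt(2)
nn_init_calculate_gain("leaky_relu", 0.2) # sqrt(2 / (1 + 0.2^2))
# }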
diff --git a/docs/reference/nn_init_constant_.html b/docs/reference/nn_init_constant_.html
deleted file mode 100644
index 081b9dc2ab566486a98e93c3ec02b2a9aba11c26..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_constant_.html
+++ /dev/null
@@ -1,219 +0,0 @@
Constant initialization — nn_init_constant_ • torch

    Fills the input Tensor with the value val.

nn_init_constant_(tensor, val)

Arguments

tensor

an n-dimensional Tensor

val

the value to fill the tensor with

Examples

# \dontrun{
w <- torch_empty(3, 5)
nn_init_constant_(w, 0.3)
#> torch_tensor
#>  0.3000  0.3000  0.3000  0.3000  0.3000
#>  0.3000  0.3000  0.3000  0.3000  0.3000
#>  0.3000  0.3000  0.3000  0.3000  0.3000
#> [ CPUFloatType{3,5} ]
# }
diff --git a/docs/reference/nn_init_dirac_.html b/docs/reference/nn_init_dirac_.html
deleted file mode 100644
index 73aa5202dbab28d3f11db92a1d51b8fd28f5c60d..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_dirac_.html
+++ /dev/null
@@ -1,217 +0,0 @@
Dirac initialization — nn_init_dirac_ • torch

Fills the 3, 4, 5-dimensional input Tensor with the Dirac delta function. Preserves the identity of the inputs in Convolutional layers, where as many input channels are preserved as possible. In case of groups > 1, each group of channels preserves identity.

nn_init_dirac_(tensor, groups = 1)

Arguments

tensor

a 3, 4, 5-dimensional torch tensor

groups

(optional) number of groups in the conv layer (default: 1)

Examples
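A minimal sketch (the 4-dimensional shape is illustrative):

# \dontrun{
w <- torch_empty(3, 16, 5, 5)
nn_init_dirac_(w)
# }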
diff --git a/docs/reference/nn_init_eye_.html b/docs/reference/nn_init_eye_.html
deleted file mode 100644
index 9dc9030f14591a46b6ca9a9154f2b433e4202b1e..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_eye_.html
+++ /dev/null
@@ -1,219 +0,0 @@
Eye initialization — nn_init_eye_ • torch

Fills the 2-dimensional input Tensor with the identity matrix. Preserves the identity of the inputs in Linear layers, where as many inputs are preserved as possible.

nn_init_eye_(tensor)

Arguments

tensor

a 2-dimensional torch tensor.

Examples

# \dontrun{
w <- torch_empty(3, 5)
nn_init_eye_(w)
#> torch_tensor
#>  1  0  0  0  0
#>  0  1  0  0  0
#>  0  0  1  0  0
#> [ CPUFloatType{3,5} ]
# }
diff --git a/docs/reference/nn_init_kaiming_normal_.html b/docs/reference/nn_init_kaiming_normal_.html
deleted file mode 100644
index d5d58686d6bf410b67a7dc5e6739bdaf3d4a1a5c..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_kaiming_normal_.html
+++ /dev/null
@@ -1,240 +0,0 @@
Kaiming normal initialization — nn_init_kaiming_normal_ • torch

Fills the input Tensor with values according to the method described in Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification - He, K. et al. (2015), using a normal distribution.

nn_init_kaiming_normal_(
  tensor,
  a = 0,
  mode = "fan_in",
  nonlinearity = "leaky_relu"
)

    Arguments

    - - - - - - - - - - - - - - - - - - -
    tensor

    an n-dimensional torch.Tensor

    a

    the negative slope of the rectifier used after this layer (only used -with 'leaky_relu')

    mode

    either 'fan_in' (default) or 'fan_out'. Choosing 'fan_in' preserves -the magnitude of the variance of the weights in the forward pass. Choosing -'fan_out' preserves the magnitudes in the backwards pass.

    nonlinearity

    the non-linear function. recommended to use only with 'relu' -or 'leaky_relu' (default).

    - - -

    Examples

    -
    # \dontrun{ -w <- torch_empty(3, 5) -nn_init_kaiming_normal_(w, mode = "fan_in", nonlinearity = "leaky_relu")
    #> torch_tensor -#> -0.5594 0.2408 0.3946 0.5860 -0.4834 -#> -0.0442 0.7170 -0.3028 0.4015 -0.8906 -#> -0.5157 -0.1763 0.9366 0.4640 -0.5356 -#> [ CPUFloatType{3,5} ]
    -# } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_init_kaiming_uniform_.html b/docs/reference/nn_init_kaiming_uniform_.html deleted file mode 100644 index 62e5e6ce159b33c2a84a8003a25100983196f0d7..0000000000000000000000000000000000000000 --- a/docs/reference/nn_init_kaiming_uniform_.html +++ /dev/null @@ -1,240 +0,0 @@ - - - - - - - - -Kaiming uniform initialization — nn_init_kaiming_uniform_ • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Fills the input Tensor with values according to the method described in Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification - He, K. et al. (2015), using a uniform distribution. Values are drawn from \(\mathcal{U}(-\mbox{bound}, \mbox{bound})\) with \(\mbox{bound} = \mbox{gain} \times \sqrt{3 / \mbox{fan\_mode}}\).

nn_init_kaiming_uniform_(
  tensor,
  a = 0,
  mode = "fan_in",
  nonlinearity = "leaky_relu"
)

Arguments

• tensor: an n-dimensional torch tensor
• a: the negative slope of the rectifier used after this layer (only used with 'leaky_relu')
• mode: either 'fan_in' (default) or 'fan_out'. Choosing 'fan_in' preserves the magnitude of the variance of the weights in the forward pass; choosing 'fan_out' preserves the magnitudes in the backward pass.
• nonlinearity: the non-linear function; recommended for use only with 'relu' or 'leaky_relu' (default)

Examples

w <- torch_empty(3, 5)
nn_init_kaiming_uniform_(w, mode = "fan_in", nonlinearity = "leaky_relu")
#> torch_tensor
#> -0.7460  0.2070 -0.1066 -0.4344 -0.4666
#> -0.5351 -0.4524  0.0950 -1.0077 -0.2169
#> -0.9525  0.8753  0.0070 -0.4553 -0.3445
#> [ CPUFloatType{3,5} ]

diff --git a/docs/reference/nn_init_normal_.html b/docs/reference/nn_init_normal_.html
deleted file mode 100644
index a7962cbc0d84088cf15c30b8d15b5ce9a8c12dc9..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_normal_.html
+++ /dev/null
@@ -1,223 +0,0 @@
Normal initialization — nn_init_normal_ • torch
Fills the input Tensor with values drawn from the normal distribution \(\mathcal{N}(\mbox{mean}, \mbox{std}^2)\).

nn_init_normal_(tensor, mean = 0, std = 1)

Arguments

• tensor: an n-dimensional Tensor
• mean: the mean of the normal distribution
• std: the standard deviation of the normal distribution

Examples

w <- torch_empty(3, 5)
nn_init_normal_(w)
#> torch_tensor
#> -1.0569 -1.0900  1.2740 -1.7728  0.0593
#> -1.7131 -0.1353  0.8191  0.1481 -0.9940
#> -0.7544 -1.0298  0.4237  1.4650  0.0575
#> [ CPUFloatType{3,5} ]

diff --git a/docs/reference/nn_init_ones_.html b/docs/reference/nn_init_ones_.html
deleted file mode 100644
index ebf2f5e6a9ed7d7bf9c60d44e60a0673a150f0ec..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_ones_.html
+++ /dev/null
@@ -1,215 +0,0 @@
Ones initialization — nn_init_ones_ • torch
Fills the input Tensor with the scalar value 1.

nn_init_ones_(tensor)

Arguments

• tensor: an n-dimensional Tensor

Examples

w <- torch_empty(3, 5)
nn_init_ones_(w)
#> torch_tensor
#> 1 1 1 1 1
#> 1 1 1 1 1
#> 1 1 1 1 1
#> [ CPUFloatType{3,5} ]

diff --git a/docs/reference/nn_init_orthogonal_.html b/docs/reference/nn_init_orthogonal_.html
deleted file mode 100644
index 661a3085d91f42035dbcb157c98488bde773cdf2..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_orthogonal_.html
+++ /dev/null
@@ -1,225 +0,0 @@
Orthogonal initialization — nn_init_orthogonal_ • torch
Fills the input Tensor with a (semi) orthogonal matrix, as described in Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - Saxe, A. et al. (2013). The input tensor must have at least 2 dimensions; for tensors with more than 2 dimensions, the trailing dimensions are flattened.

nn_init_orthogonal_(tensor, gain = 1)

Arguments

• tensor: an n-dimensional Tensor, with n >= 2
• gain: optional scaling factor

Examples
w <- torch_empty(3, 5)
nn_init_orthogonal_(w)
#> torch_tensor
#> -0.2147  0.0073 -0.0312 -0.0439  0.9752
#> -0.8268  0.5222  0.0419  0.0979 -0.1802
#>  0.3963  0.5329 -0.0498  0.7371  0.1148
#> [ CPUFloatType{3,5} ]
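A quick sanity check, a sketch not on the original page: the rows of the 3 x 5 result are orthonormal, so the Gram matrix is approximately the identity.

w$matmul(w$t())  # approximately the 3 x 3 identity matrix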
diff --git a/docs/reference/nn_init_sparse_.html b/docs/reference/nn_init_sparse_.html
deleted file mode 100644
index c9b963d67878426e2aca519360eda1f7d86189c7..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_sparse_.html
+++ /dev/null
@@ -1,220 +0,0 @@
Sparse initialization — nn_init_sparse_ • torch
Fills the 2D input Tensor as a sparse matrix, where the non-zero elements will be drawn from the normal distribution, as described in Deep learning via Hessian-free optimization - Martens, J. (2010).

nn_init_sparse_(tensor, sparsity, std = 0.01)

Arguments

• tensor: a 2-dimensional torch tensor
• sparsity: the fraction of elements in each column to be set to zero
• std: the standard deviation of the normal distribution used to generate the non-zero values

Examples
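The page shipped without a rendered example; a minimal sketch, with illustrative sizes, assuming a working torch installation:

w <- torch_empty(4, 6)
nn_init_sparse_(w, sparsity = 0.5)  # roughly half the entries of each column become zero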
    
diff --git a/docs/reference/nn_init_trunc_normal_.html b/docs/reference/nn_init_trunc_normal_.html
deleted file mode 100644
index 9ed52ecb0d7ffe5b6a7be855592507dcb1521046..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_trunc_normal_.html
+++ /dev/null
@@ -1,233 +0,0 @@
Truncated normal initialization — nn_init_trunc_normal_ • torch
Fills the input Tensor with values drawn from a truncated normal distribution: values are effectively drawn from \(\mathcal{N}(\mbox{mean}, \mbox{std}^2)\), with values outside \([a, b]\) redrawn until they fall within the bounds. Note that the default maximum cutoff is b = 2, not -2 as an earlier version of this signature stated.

nn_init_trunc_normal_(tensor, mean = 0, std = 1, a = -2, b = 2)

Arguments

• tensor: an n-dimensional Tensor
• mean: the mean of the normal distribution
• std: the standard deviation of the normal distribution
• a: the minimum cutoff value
• b: the maximum cutoff value

Examples

w <- torch_empty(3, 5)
nn_init_trunc_normal_(w)

diff --git a/docs/reference/nn_init_uniform_.html b/docs/reference/nn_init_uniform_.html
deleted file mode 100644
index 9e974bcd2e646611ffb8970679a63ea4e06057d1..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_uniform_.html
+++ /dev/null
@@ -1,223 +0,0 @@
Uniform initialization — nn_init_uniform_ • torch
Fills the input Tensor with values drawn from the uniform distribution \(\mathcal{U}(a, b)\).

nn_init_uniform_(tensor, a = 0, b = 1)

Arguments

• tensor: an n-dimensional Tensor
• a: the lower bound of the uniform distribution
• b: the upper bound of the uniform distribution

Examples

w <- torch_empty(3, 5)
nn_init_uniform_(w)
#> torch_tensor
#> 0.8556 0.9331 0.3515 0.8071 0.4948
#> 0.6075 0.9042 0.7181 0.7329 0.7563
#> 0.2584 0.5293 0.9757 0.3030 0.3341
#> [ CPUFloatType{3,5} ]

diff --git a/docs/reference/nn_init_xavier_normal_.html b/docs/reference/nn_init_xavier_normal_.html
deleted file mode 100644
index bf02555881ff0e87d40d9a58d4354ed0fab8a522..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_xavier_normal_.html
+++ /dev/null
@@ -1,223 +0,0 @@
Xavier normal initialization — nn_init_xavier_normal_ • torch
Fills the input Tensor with values according to the method described in Understanding the difficulty of training deep feedforward neural networks - Glorot, X. & Bengio, Y. (2010), using a normal distribution. Values are drawn from \(\mathcal{N}(0, \mbox{std}^2)\) with \(\mbox{std} = \mbox{gain} \times \sqrt{2 / (\mbox{fan\_in} + \mbox{fan\_out})}\).

nn_init_xavier_normal_(tensor, gain = 1)

Arguments

• tensor: an n-dimensional Tensor
• gain: an optional scaling factor

Examples

w <- torch_empty(3, 5)
nn_init_xavier_normal_(w)
#> torch_tensor
#>  1.2535 -0.2197  0.5425 -3.0052 -4.2446
#> -0.3570 -1.6970 -2.0154 -0.5348  2.7582
#>  0.8714 -0.8924  0.7675  3.2553 -1.4333
#> [ CPUFloatType{3,5} ]

diff --git a/docs/reference/nn_init_xavier_uniform_.html b/docs/reference/nn_init_xavier_uniform_.html
deleted file mode 100644
index c925e4da3bd856dea30faea9ba24568a043a9a7e..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_xavier_uniform_.html
+++ /dev/null
@@ -1,223 +0,0 @@
Xavier uniform initialization — nn_init_xavier_uniform_ • torch
Fills the input Tensor with values according to the method described in Understanding the difficulty of training deep feedforward neural networks - Glorot, X. & Bengio, Y. (2010), using a uniform distribution. Values are drawn from \(\mathcal{U}(-a, a)\) with \(a = \mbox{gain} \times \sqrt{6 / (\mbox{fan\_in} + \mbox{fan\_out})}\).

nn_init_xavier_uniform_(tensor, gain = 1)

Arguments

• tensor: an n-dimensional Tensor
• gain: an optional scaling factor

Examples

w <- torch_empty(3, 5)
nn_init_xavier_uniform_(w)
#> torch_tensor
#>  1.3397  1.1040 -3.0453 -1.7935  0.9545
#> -0.0194 -2.4483  2.9345  2.2750 -2.4048
#> -0.4406 -2.2409  0.4155 -0.1573  1.9776
#> [ CPUFloatType{3,5} ]

diff --git a/docs/reference/nn_init_zeros_.html b/docs/reference/nn_init_zeros_.html
deleted file mode 100644
index 9e4a29058be88615968b66a22055e0f40bc59bf0..0000000000000000000000000000000000000000
--- a/docs/reference/nn_init_zeros_.html
+++ /dev/null
@@ -1,215 +0,0 @@
Zeros initialization — nn_init_zeros_ • torch
Fills the input Tensor with the scalar value 0.

nn_init_zeros_(tensor)

Arguments

• tensor: an n-dimensional tensor

Examples

w <- torch_empty(3, 5)
nn_init_zeros_(w)
#> torch_tensor
#> 0 0 0 0 0
#> 0 0 0 0 0
#> 0 0 0 0 0
#> [ CPUFloatType{3,5} ]

diff --git a/docs/reference/nn_leaky_relu.html b/docs/reference/nn_leaky_relu.html
deleted file mode 100644
index 884c40ebbf1ba044c6376135f3c34ac8a490db79..0000000000000000000000000000000000000000
--- a/docs/reference/nn_leaky_relu.html
+++ /dev/null
@@ -1,240 +0,0 @@
LeakyReLU module — nn_leaky_relu • torch
Applies the element-wise function:

nn_leaky_relu(negative_slope = 0.01, inplace = FALSE)

Arguments

• negative_slope: controls the angle of the negative slope. Default: 1e-2
• inplace: can optionally do the operation in-place. Default: FALSE

Details

$$
\mbox{LeakyReLU}(x) = \max(0, x) + \mbox{negative\_slope} \times \min(0, x)
$$

or

$$
\mbox{LeakyReLU}(x) =
\left\{
\begin{array}{ll}
x, & \mbox{if } x \geq 0 \\
\mbox{negative\_slope} \times x, & \mbox{otherwise}
\end{array}
\right.
$$

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

m <- nn_leaky_relu(0.1)
input <- torch_randn(2)
output <- m(input)

diff --git a/docs/reference/nn_linear.html b/docs/reference/nn_linear.html
deleted file mode 100644
index d5a6fee866238a95d8755f6b09e1332a7edb2ba2..0000000000000000000000000000000000000000
--- a/docs/reference/nn_linear.html
+++ /dev/null
@@ -1,248 +0,0 @@
Linear module — nn_linear • torch
Applies a linear transformation to the incoming data: \(y = xA^T + b\).

nn_linear(in_features, out_features, bias = TRUE)

Arguments

• in_features: size of each input sample
• out_features: size of each output sample
• bias: if set to FALSE, the layer will not learn an additive bias. Default: TRUE

Shape

• Input: \((N, *, H_{in})\) where * means any number of additional dimensions and \(H_{in} = \mbox{in\_features}\).
• Output: \((N, *, H_{out})\) where all but the last dimension are the same shape as the input and \(H_{out} = \mbox{out\_features}\).

Attributes

• weight: the learnable weights of the module of shape (out_features, in_features). The values are initialized from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\), where \(k = \frac{1}{\mbox{in\_features}}\).
• bias: the learnable bias of the module of shape \((\mbox{out\_features})\). If bias is TRUE, the values are initialized from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{1}{\mbox{in\_features}}\).

Examples

m <- nn_linear(20, 30)
input <- torch_randn(128, 20)
output <- m(input)
print(output$size())
#> [1] 128 30

diff --git a/docs/reference/nn_log_sigmoid.html b/docs/reference/nn_log_sigmoid.html
deleted file mode 100644
index b75b58183b43ac9e65235503d9c9617687714bc9..0000000000000000000000000000000000000000
--- a/docs/reference/nn_log_sigmoid.html
+++ /dev/null
@@ -1,220 +0,0 @@
LogSigmoid module — nn_log_sigmoid • torch
Applies the element-wise function:

$$
\mbox{LogSigmoid}(x) = \log\left(\frac{1}{1 + \exp(-x)}\right)
$$

nn_log_sigmoid()

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

m <- nn_log_sigmoid()
input <- torch_randn(2)
output <- m(input)

diff --git a/docs/reference/nn_log_softmax.html b/docs/reference/nn_log_softmax.html
deleted file mode 100644
index e20e5a05a177eef6eb12611ce37a9af55f40a9ec..0000000000000000000000000000000000000000
--- a/docs/reference/nn_log_softmax.html
+++ /dev/null
@@ -1,233 +0,0 @@
LogSoftmax module — nn_log_softmax • torch
Applies the \(\log(\mbox{Softmax}(x))\) function to an n-dimensional input Tensor. The LogSoftmax formulation can be simplified as:

$$
\mbox{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_i)}{\sum_j \exp(x_j)}\right)
$$

nn_log_softmax(dim)

Arguments

• dim: (int) the dimension along which LogSoftmax will be computed.

Value

a Tensor of the same dimension and shape as the input, with values in the range [-inf, 0)

Shape

• Input: \((*)\) where * means any number of additional dimensions
• Output: \((*)\), same shape as the input

Examples
m <- nn_log_softmax(1)
input <- torch_randn(2, 3)
output <- m(input)
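In practice nn_log_softmax is often paired with nnf_nll_loss(), which expects log-probabilities. A hedged sketch, with illustrative sizes; note that class targets are 1-based long tensors in R torch:

m <- nn_log_softmax(dim = 2)
scores <- torch_randn(3, 5)                               # 3 samples, 5 classes
target <- torch_tensor(c(2, 1, 5), dtype = torch_long())  # one class per sample
loss <- nnf_nll_loss(m(scores), target)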
diff --git a/docs/reference/nn_max_pool1d.html b/docs/reference/nn_max_pool1d.html
deleted file mode 100644
index 7a707ed7b3a3969dd7bf0418fca167d8afcef87c..0000000000000000000000000000000000000000
--- a/docs/reference/nn_max_pool1d.html
+++ /dev/null
@@ -1,268 +0,0 @@
MaxPool1D module — nn_max_pool1d • torch
Applies a 1D max pooling over an input signal composed of several input planes.

nn_max_pool1d(
  kernel_size,
  stride = NULL,
  padding = 0,
  dilation = 1,
  return_indices = FALSE,
  ceil_mode = FALSE
)

Arguments

• kernel_size: the size of the window to take a max over
• stride: the stride of the window. Default value is kernel_size
• padding: implicit zero padding to be added on both sides
• dilation: a parameter that controls the stride of elements in the window
• return_indices: if TRUE, will return the max indices along with the outputs. Useful for nn_max_unpool1d() later.
• ceil_mode: when TRUE, will use ceil instead of floor to compute the output shape

Details

In the simplest case, the output value of the layer with input size \((N, C, L)\) and output \((N, C, L_{out})\) can be precisely described as:

$$
out(N_i, C_j, k) = \max_{m = 0, \ldots, \mbox{kernel\_size} - 1} input(N_i, C_j, \mbox{stride} \times k + m)
$$

If padding is non-zero, then the input is implicitly zero-padded on both sides for padding number of points. dilation controls the spacing between the kernel points; it is harder to describe, but this link has a nice visualization of what dilation does.

Shape

• Input: \((N, C, L_{in})\)
• Output: \((N, C, L_{out})\), where

$$
L_{out} = \left\lfloor \frac{L_{in} + 2 \times \mbox{padding} - \mbox{dilation} \times (\mbox{kernel\_size} - 1) - 1}{\mbox{stride}} + 1 \right\rfloor
$$
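As a worked instance of this formula (numbers chosen to match the example below, not from the original page): with \(L_{in} = 50\), kernel_size = 3, stride = 2, padding = 0 and dilation = 1, \(L_{out} = \lfloor (50 - 2 - 1)/2 + 1 \rfloor = 24\), which can be checked directly:

m <- nn_max_pool1d(3, stride = 2)
m(torch_randn(20, 16, 50))$size()  # 20 16 24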

    Examples

# pool of size 3, stride 2
m <- nn_max_pool1d(3, stride = 2)
input <- torch_randn(20, 16, 50)
output <- m(input)
diff --git a/docs/reference/nn_max_pool2d.html b/docs/reference/nn_max_pool2d.html
deleted file mode 100644
index 7804fca4a25b77ed38e8fa8408996490272fe78b..0000000000000000000000000000000000000000
--- a/docs/reference/nn_max_pool2d.html
+++ /dev/null
@@ -1,283 +0,0 @@
MaxPool2D module — nn_max_pool2d • torch
Applies a 2D max pooling over an input signal composed of several input planes.

nn_max_pool2d(
  kernel_size,
  stride = NULL,
  padding = 0,
  dilation = 1,
  return_indices = FALSE,
  ceil_mode = FALSE
)

Arguments

• kernel_size: the size of the window to take a max over
• stride: the stride of the window. Default value is kernel_size
• padding: implicit zero padding to be added on both sides
• dilation: a parameter that controls the stride of elements in the window
• return_indices: if TRUE, will return the max indices along with the outputs. Useful for nn_max_unpool2d() later.
• ceil_mode: when TRUE, will use ceil instead of floor to compute the output shape

Details

In the simplest case, the output value of the layer with input size \((N, C, H, W)\), output \((N, C, H_{out}, W_{out})\) and kernel_size \((kH, kW)\) can be precisely described as:

$$
out(N_i, C_j, h, w) = \max_{m = 0, \ldots, kH - 1} \max_{n = 0, \ldots, kW - 1} \mbox{input}(N_i, C_j, \mbox{stride[0]} \times h + m, \mbox{stride[1]} \times w + n)
$$

If padding is non-zero, then the input is implicitly zero-padded on both sides for padding number of points. dilation controls the spacing between the kernel points; it is harder to describe, but this link has a nice visualization of what dilation does.

The parameters kernel_size, stride, padding and dilation can either be:

• a single int, in which case the same value is used for the height and width dimension
• a tuple of two ints, in which case the first int is used for the height dimension and the second int for the width dimension

Shape

• Input: \((N, C, H_{in}, W_{in})\)
• Output: \((N, C, H_{out}, W_{out})\), where

$$
H_{out} = \left\lfloor \frac{H_{in} + 2 \times \mbox{padding[0]} - \mbox{dilation[0]} \times (\mbox{kernel\_size[0]} - 1) - 1}{\mbox{stride[0]}} + 1 \right\rfloor
$$

$$
W_{out} = \left\lfloor \frac{W_{in} + 2 \times \mbox{padding[1]} - \mbox{dilation[1]} \times (\mbox{kernel\_size[1]} - 1) - 1}{\mbox{stride[1]}} + 1 \right\rfloor
$$
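The same arithmetic as in the 1D case, applied per dimension (numbers chosen to match the non-square example below, not from the original page): for a 50 x 32 input with kernel c(3, 2) and stride c(2, 1), H_out = floor((50 - 2 - 1)/2 + 1) = 24 and W_out = floor((32 - 1 - 1)/1 + 1) = 31:

m <- nn_max_pool2d(c(3, 2), stride = c(2, 1))
m(torch_randn(20, 16, 50, 32))$size()  # 20 16 24 31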

    Examples

# pool of square window of size 3, stride 2
m <- nn_max_pool2d(3, stride = 2)
# pool of non-square window
m <- nn_max_pool2d(c(3, 2), stride = c(2, 1))
input <- torch_randn(20, 16, 50, 32)
output <- m(input)
diff --git a/docs/reference/nn_module.html b/docs/reference/nn_module.html
deleted file mode 100644
index 8848bc1feac4965996564770ef19085726cc17db..0000000000000000000000000000000000000000
--- a/docs/reference/nn_module.html
+++ /dev/null
@@ -1,234 +0,0 @@
Base class for all neural network modules. — nn_module • torch
Your models should also subclass this class.

nn_module(classname = NULL, inherit = nn_Module, ...)

Arguments

• classname: an optional name for the module
• inherit: an optional module to inherit from
• ...: methods implementation

Details

Modules can also contain other modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes.

Examples

model <- nn_module(
  initialize = function() {
    self$conv1 <- nn_conv2d(1, 20, 5)
    self$conv2 <- nn_conv2d(20, 20, 5)
  },
  forward = function(input) {
    input <- self$conv1(input)
    input <- nnf_relu(input)
    input <- self$conv2(input)
    input <- nnf_relu(input)
    input
  }
)
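nn_module() returns a module generator; instantiating it gives a module that can be called on a tensor. A usage sketch continuing the example above (the input size is illustrative only):

net <- model()
input <- torch_randn(1, 1, 28, 28)
output <- net(input)
output$size()  # 1 20 20 20, since each 5 x 5 convolution shrinks the spatial dims by 4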
diff --git a/docs/reference/nn_module_list.html b/docs/reference/nn_module_list.html
deleted file mode 100644
index b03a9e1bfd1b176b08eb83fde6b04c0aa3f9fa8b..0000000000000000000000000000000000000000
--- a/docs/reference/nn_module_list.html
+++ /dev/null
@@ -1,224 +0,0 @@
Holds submodules in a list. — nn_module_list • torch
nn_module_list can be indexed like a regular R list, but the modules it contains are properly registered and will be visible to all nn_module methods.

nn_module_list(modules = list())

Arguments

• modules: a list of modules to add

Examples

my_module <- nn_module(
  initialize = function() {
    self$linears <- nn_module_list(lapply(1:10, function(x) nn_linear(10, 10)))
  },
  forward = function(x) {
    for (i in 1:length(self$linears))
      x <- self$linears[[i]](x)
    x
  }
)

diff --git a/docs/reference/nn_multihead_attention.html b/docs/reference/nn_multihead_attention.html
deleted file mode 100644
index f7b9e3a961f95ffa22401e6963111f06de242906..0000000000000000000000000000000000000000
--- a/docs/reference/nn_multihead_attention.html
+++ /dev/null
@@ -1,289 +0,0 @@
MultiHead attention — nn_multihead_attention • torch
Allows the model to jointly attend to information from different representation subspaces. See reference: Attention Is All You Need.

nn_multihead_attention(
  embed_dim,
  num_heads,
  dropout = 0,
  bias = TRUE,
  add_bias_kv = FALSE,
  add_zero_attn = FALSE,
  kdim = NULL,
  vdim = NULL
)

Arguments

• embed_dim: total dimension of the model.
• num_heads: number of parallel attention heads.
• dropout: a Dropout layer on attn_output_weights. Default: 0.0.
• bias: add bias as a module parameter. Default: TRUE.
• add_bias_kv: add bias to the key and value sequences at dim = 0.
• add_zero_attn: add a new batch of zeros to the key and value sequences at dim = 1.
• kdim: total number of features in key. Default: NULL.
• vdim: total number of features in value. Default: NULL. Note: if kdim and vdim are NULL, they will be set to embed_dim so that query, key, and value have the same number of features.

Details

$$
\mbox{MultiHead}(Q, K, V) = \mbox{Concat}(head_1, \dots, head_h) W^O
\quad \mbox{where } head_i = \mbox{Attention}(Q W_i^Q, K W_i^K, V W_i^V)
$$

Shape

Inputs:

• query: \((L, N, E)\) where L is the target sequence length, N is the batch size and E is the embedding dimension.
• key: \((S, N, E)\) where S is the source sequence length, N is the batch size and E is the embedding dimension.
• value: \((S, N, E)\) where S is the source sequence length, N is the batch size and E is the embedding dimension.
• key_padding_mask: \((N, S)\) where N is the batch size and S is the source sequence length. If a ByteTensor is provided, the non-zero positions will be ignored while the zero positions will be unchanged. If a BoolTensor is provided, positions with the value TRUE will be ignored while positions with the value FALSE will be unchanged.
• attn_mask: a 2D mask \((L, S)\) where L is the target sequence length and S is the source sequence length, or a 3D mask \((N \times \mbox{num\_heads}, L, S)\). attn_mask ensures that position i is only allowed to attend the unmasked positions. If a ByteTensor is provided, the non-zero positions are not allowed to attend while the zero positions will be unchanged. If a BoolTensor is provided, positions with TRUE are not allowed to attend while FALSE values will be unchanged. If a FloatTensor is provided, it will be added to the attention weight.

Outputs:

• attn_output: \((L, N, E)\) where L is the target sequence length, N is the batch size and E is the embedding dimension.
• attn_output_weights: \((N, L, S)\) where N is the batch size, L is the target sequence length and S is the source sequence length.

Examples
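The page shipped without a rendered example; a minimal self-attention sketch, with illustrative sizes, assuming a working torch installation and that the module returns a list of (attn_output, attn_output_weights):

attn <- nn_multihead_attention(embed_dim = 8, num_heads = 2)
x <- torch_randn(4, 2, 8)  # (L, N, E): sequence length 4, batch 2
out <- attn(x, x, x)       # self-attention: query = key = value
out[[1]]$size()  # attn_output: 4 2 8
out[[2]]$size()  # attn_output_weights: 2 4 4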
    
diff --git a/docs/reference/nn_prelu.html b/docs/reference/nn_prelu.html
deleted file mode 100644
index ec2da259c793fbb1d12b906fd9ba75685094c4dd..0000000000000000000000000000000000000000
--- a/docs/reference/nn_prelu.html
+++ /dev/null
@@ -1,270 +0,0 @@
PReLU module — nn_prelu • torch
Applies the element-wise function:

$$
\mbox{PReLU}(x) = \max(0, x) + a \times \min(0, x)
$$

or

$$
\mbox{PReLU}(x) =
\left\{
\begin{array}{ll}
x, & \mbox{if } x \geq 0 \\
ax, & \mbox{otherwise}
\end{array}
\right.
$$

nn_prelu(num_parameters = 1, init = 0.25)

Arguments

• num_parameters: (int) number of \(a\) to learn. Although it takes an int as input, only two values are legitimate: 1, or the number of channels at input. Default: 1
• init: (float) the initial value of \(a\). Default: 0.25

Details

Here \(a\) is a learnable parameter. When called without arguments, nn_prelu() uses a single parameter \(a\) across all input channels. If called as nn_prelu(nChannels), a separate \(a\) is used for each input channel.

Note

For good performance, weight decay should not be used when learning \(a\).

The channel dim is the 2nd dim of the input. When the input has fewer than 2 dims, there is no channel dim and the number of channels is 1.

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Attributes

• weight (Tensor): the learnable weights of shape (num_parameters).

Examples

m <- nn_prelu()
input <- torch_randn(2)
output <- m(input)
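Per the details above, passing num_parameters makes \(a\) channel-wise. A sketch with illustrative sizes, not from the original page:

m <- nn_prelu(num_parameters = 3)  # one learnable a per channel
input <- torch_randn(4, 3, 8)      # channel dim is the 2nd dim
output <- m(input)
m$weight$size()  # 3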
diff --git a/docs/reference/nn_relu.html b/docs/reference/nn_relu.html
deleted file mode 100644
index 21a181a9b61cb56a9d170f77e82ce857fe1e0d9c..0000000000000000000000000000000000000000
--- a/docs/reference/nn_relu.html
+++ /dev/null
@@ -1,227 +0,0 @@
ReLU module — nn_relu • torch
Applies the rectified linear unit function element-wise:

$$
\mbox{ReLU}(x) = (x)^+ = \max(0, x)
$$

nn_relu(inplace = FALSE)

Arguments

• inplace: can optionally do the operation in-place. Default: FALSE

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

m <- nn_relu()
input <- torch_randn(2)
m(input)
#> torch_tensor
#> 0.2952
#> 0.0000
#> [ CPUFloatType{2} ]

diff --git a/docs/reference/nn_relu6.html b/docs/reference/nn_relu6.html
deleted file mode 100644
index 3bc8d5d38286802c0d35938cf0b51f0c5b39c8b8..0000000000000000000000000000000000000000
--- a/docs/reference/nn_relu6.html
+++ /dev/null
@@ -1,227 +0,0 @@
ReLu6 module — nn_relu6 • torch
Applies the element-wise function:

nn_relu6(inplace = FALSE)

Arguments

• inplace: can optionally do the operation in-place. Default: FALSE

Details

$$
\mbox{ReLU6}(x) = \min(\max(0, x), 6)
$$

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

m <- nn_relu6()
input <- torch_randn(2)
output <- m(input)

diff --git a/docs/reference/nn_rnn.html b/docs/reference/nn_rnn.html
deleted file mode 100644
index 343ee10ffe3ba0ddeb94131d8f7ec7b338232178..0000000000000000000000000000000000000000
--- a/docs/reference/nn_rnn.html
+++ /dev/null
@@ -1,446 +0,0 @@
RNN module — nn_rnn • torch
Applies a multi-layer Elman RNN with \(\tanh\) or \(\mbox{ReLU}\) non-linearity to an input sequence.

nn_rnn(
  input_size,
  hidden_size,
  num_layers = 1,
  nonlinearity = NULL,
  bias = TRUE,
  batch_first = FALSE,
  dropout = 0,
  bidirectional = FALSE,
  ...
)

Arguments

• input_size: the number of expected features in the input x
• hidden_size: the number of features in the hidden state h
• num_layers: number of recurrent layers. E.g., setting num_layers = 2 would mean stacking two RNNs together to form a stacked RNN, with the second RNN taking in outputs of the first RNN and computing the final results. Default: 1
• nonlinearity: the non-linearity to use. Can be either 'tanh' or 'relu'. Default: 'tanh'
• bias: if FALSE, then the layer does not use bias weights b_ih and b_hh. Default: TRUE
• batch_first: if TRUE, then the input and output tensors are provided as (batch, seq, feature). Default: FALSE
• dropout: if non-zero, introduces a Dropout layer on the outputs of each RNN layer except the last layer, with dropout probability equal to dropout. Default: 0
• bidirectional: if TRUE, becomes a bidirectional RNN. Default: FALSE
• ...: other arguments that can be passed to the super class.

Details

For each element in the input sequence, each layer computes the following function:

$$
h_t = \tanh(W_{ih} x_t + b_{ih} + W_{hh} h_{(t-1)} + b_{hh})
$$

where \(h_t\) is the hidden state at time t, \(x_t\) is the input at time t, and \(h_{(t-1)}\) is the hidden state of the previous layer at time t-1 or the initial hidden state at time 0. If nonlinearity is 'relu', then \(\mbox{ReLU}\) is used instead of \(\tanh\).

Inputs

• input of shape (seq_len, batch, input_size): tensor containing the features of the input sequence. The input can also be a packed variable-length sequence.
• h_0 of shape (num_layers * num_directions, batch, hidden_size): tensor containing the initial hidden state for each element in the batch. Defaults to zero if not provided. If the RNN is bidirectional, num_directions should be 2, else it should be 1.

Outputs

• output of shape (seq_len, batch, num_directions * hidden_size): tensor containing the output features (h_t) from the last layer of the RNN, for each t. If an nn_packed_sequence has been given as the input, the output will also be a packed sequence. For the unpacked case, the directions can be separated using output$view(seq_len, batch, num_directions, hidden_size), with forward and backward being direction 0 and 1 respectively. Similarly, the directions can be separated in the packed case.
• h_n of shape (num_layers * num_directions, batch, hidden_size): tensor containing the hidden state for t = seq_len. Like output, the layers can be separated using h_n$view(num_layers, num_directions, batch, hidden_size).

Shape

• Input1: \((L, N, H_{in})\) tensor containing input features, where \(H_{in} = \mbox{input\_size}\) and L represents a sequence length.
• Input2: \((S, N, H_{out})\) tensor containing the initial hidden state for each element in the batch, where \(H_{out} = \mbox{hidden\_size}\) and \(S = \mbox{num\_layers} \times \mbox{num\_directions}\). Defaults to zero if not provided. If the RNN is bidirectional, num_directions should be 2, else it should be 1.
• Output1: \((L, N, H_{all})\) where \(H_{all} = \mbox{num\_directions} \times \mbox{hidden\_size}\)
• Output2: \((S, N, H_{out})\) tensor containing the next hidden state for each element in the batch

Attributes

• weight_ih_l[k]: the learnable input-hidden weights of the k-th layer, of shape (hidden_size, input_size) for k = 0. Otherwise, the shape is (hidden_size, num_directions * hidden_size)
• weight_hh_l[k]: the learnable hidden-hidden weights of the k-th layer, of shape (hidden_size, hidden_size)
• bias_ih_l[k]: the learnable input-hidden bias of the k-th layer, of shape (hidden_size)
• bias_hh_l[k]: the learnable hidden-hidden bias of the k-th layer, of shape (hidden_size)

Note

All the weights and biases are initialized from \(\mathcal{U}(-\sqrt{k}, \sqrt{k})\) where \(k = \frac{1}{\mbox{hidden\_size}}\).

Examples

rnn <- nn_rnn(10, 20, 2)
input <- torch_randn(5, 3, 10)
h0 <- torch_randn(2, 3, 20)
rnn(input, h0)
#> [[1]]
#> torch_tensor
#> ... (the full 5 x 3 x 20 output sequence printed by the built page) ...
#> [ CPUFloatType{5,3,20} ]
#>
#> [[2]]
#> torch_tensor
#> ... (the 2 x 3 x 20 final hidden state printed by the built page) ...
#> [ CPUFloatType{2,3,20} ]

diff --git a/docs/reference/nn_rrelu.html b/docs/reference/nn_rrelu.html
deleted file mode 100644
index 5755b8fb004d6bae40872a67126d0de3f0e74480..0000000000000000000000000000000000000000
--- a/docs/reference/nn_rrelu.html
+++ /dev/null
@@ -1,250 +0,0 @@
RReLU module — nn_rrelu • torch
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper Empirical Evaluation of Rectified Activations in Convolutional Network.

nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)

Arguments

• lower: lower bound of the uniform distribution. Default: \(\frac{1}{8}\)
• upper: upper bound of the uniform distribution. Default: \(\frac{1}{3}\)
• inplace: can optionally do the operation in-place. Default: FALSE

Details

The function is defined as:

$$
\mbox{RReLU}(x) =
\left\{
\begin{array}{ll}
x, & \mbox{if } x \geq 0 \\
ax, & \mbox{otherwise}
\end{array}
\right.
$$

where \(a\) is randomly sampled from the uniform distribution \(\mathcal{U}(\mbox{lower}, \mbox{upper})\). See: https://arxiv.org/pdf/1505.00853.pdf

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

m <- nn_rrelu(0.1, 0.3)
input <- torch_randn(2)
m(input)
#> torch_tensor
#> -0.0421
#>  1.4246
#> [ CPUFloatType{2} ]

diff --git a/docs/reference/nn_selu.html b/docs/reference/nn_selu.html
deleted file mode 100644
index f38b8e60b05a3659b174d37010537dcbb10ca812..0000000000000000000000000000000000000000
--- a/docs/reference/nn_selu.html
+++ /dev/null
@@ -1,231 +0,0 @@
SELU module — nn_selu • torch
Applied element-wise, as:

$$
\mbox{SELU}(x) = \mbox{scale} \times (\max(0, x) + \min(0, \alpha \times (\exp(x) - 1)))
$$

nn_selu(inplace = FALSE)

Arguments

• inplace: (bool, optional) can optionally do the operation in-place. Default: FALSE

Details

with \(\alpha = 1.6732632423543772848170429916717\) and \(\mbox{scale} = 1.0507009873554804934193349852946\).

More details can be found in the paper Self-Normalizing Neural Networks.

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

m <- nn_selu()
input <- torch_randn(2)
output <- m(input)

diff --git a/docs/reference/nn_sequential.html b/docs/reference/nn_sequential.html
deleted file mode 100644
index afd123a4acd5933edfef6323f7fb501068aaaeec..0000000000000000000000000000000000000000
--- a/docs/reference/nn_sequential.html
+++ /dev/null
@@ -1,226 +0,0 @@
A sequential container — nn_sequential • torch
A sequential container. Modules will be added to it in the order they are passed in the constructor. See examples.

nn_sequential(..., name = NULL)

Arguments

• ...: sequence of modules to be added
• name: optional name for the generated module.

Examples

model <- nn_sequential(
  nn_conv2d(1, 20, 5),
  nn_relu(),
  nn_conv2d(20, 64, 5),
  nn_relu()
)
input <- torch_randn(32, 1, 28, 28)
output <- model(input)

diff --git a/docs/reference/nn_sigmoid.html b/docs/reference/nn_sigmoid.html
deleted file mode 100644
index 912e815e93ab78dc5515fd7d442ee64740816272..0000000000000000000000000000000000000000
--- a/docs/reference/nn_sigmoid.html
+++ /dev/null
@@ -1,219 +0,0 @@
Sigmoid module — nn_sigmoid • torch
Applies the element-wise function:

nn_sigmoid()

Details

$$
\mbox{Sigmoid}(x) = \sigma(x) = \frac{1}{1 + \exp(-x)}
$$

Shape

• Input: \((N, *)\) where * means any number of additional dimensions
• Output: \((N, *)\), same shape as the input

Examples

m <- nn_sigmoid()
input <- torch_randn(2)
output <- m(input)

diff --git a/docs/reference/nn_softmax.html b/docs/reference/nn_softmax.html
deleted file mode 100644
index 9ffface0c9bc32f19266ccf9b99a0f895d7f5080..0000000000000000000000000000000000000000
--- a/docs/reference/nn_softmax.html
+++ /dev/null
@@ -1,246 +0,0 @@
Softmax module — nn_softmax • torch
Applies the Softmax function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1. Softmax is defined as:

nn_softmax(dim)

Arguments

• dim: (int) the dimension along which Softmax will be computed (so every slice along dim will sum to 1).

Value

a Tensor of the same dimension and shape as the input, with values in the range [0, 1]

Details

$$
\mbox{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}
$$

When the input Tensor is a sparse tensor, the unspecified values are treated as -Inf.

Note

This module doesn't work directly with NLLLoss, which expects the log to be computed between the Softmax and itself. Use LogSoftmax instead (it's faster and has better numerical properties).

Shape

• Input: \((*)\) where * means any number of additional dimensions
• Output: \((*)\), same shape as the input

Examples

m <- nn_softmax(1)
input <- torch_randn(2, 3)
output <- m(input)
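To make the role of dim concrete, one can check that slices along that dimension sum to one; a quick sketch, not on the original page:

output$sum(dim = 1)  # a length-3 tensor of ones: each column of the 2 x 3 output sums to 1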
diff --git a/docs/reference/nn_softmax2d.html b/docs/reference/nn_softmax2d.html
deleted file mode 100644
index 7288011ff4910bebde7ebf27e108b4c71ed8686e..0000000000000000000000000000000000000000
--- a/docs/reference/nn_softmax2d.html
+++ /dev/null
@@ -1,221 +0,0 @@
Softmax2d module — nn_softmax2d • torch
Applies Softmax over features to each spatial location. When given an image of Channels x Height x Width, it will apply Softmax to each location \((\mbox{Channels}, h_i, w_j)\).

nn_softmax2d()

Value

a Tensor of the same dimension and shape as the input, with values in the range [0, 1]

Shape

• Input: \((N, C, H, W)\)
• Output: \((N, C, H, W)\) (same shape as input)

Examples

m <- nn_softmax2d()
input <- torch_randn(2, 3, 12, 13)
output <- m(input)

diff --git a/docs/reference/nn_softmin.html b/docs/reference/nn_softmin.html
deleted file mode 100644
index 59aa790e75defb8b2eec24afc6d033cd3dbc3fb2..0000000000000000000000000000000000000000
--- a/docs/reference/nn_softmin.html
+++ /dev/null
@@ -1,238 +0,0 @@
Softmin — nn_softmin • torch
Applies the Softmin function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1. Softmin is defined as:

nn_softmin(dim)

Arguments

• dim: (int) the dimension along which Softmin will be computed (so every slice along dim will sum to 1).

Value

a Tensor of the same dimension and shape as the input, with values in the range [0, 1].

Details

$$
\mbox{Softmin}(x_{i}) = \frac{\exp(-x_i)}{\sum_j \exp(-x_j)}
$$

Shape

• Input: \((*)\) where * means any number of additional dimensions
• Output: \((*)\), same shape as the input

Examples

m <- nn_softmin(dim = 1)
input <- torch_randn(2, 2)
output <- m(input)

diff --git a/docs/reference/nn_softplus.html b/docs/reference/nn_softplus.html
deleted file mode 100644
index 9ed55c7ea9a7324cdb33f50c78898f5362da1c36..0000000000000000000000000000000000000000
--- a/docs/reference/nn_softplus.html
+++ /dev/null
@@ -1,238 +0,0 @@
Softplus module — nn_softplus • torch
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the element-wise function: -$$ - \mbox{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x)) -$$

    -
    - -
    nn_softplus(beta = 1, threshold = 20)
    - -

    Arguments

    - - - - - - - - - - -
    beta

    the \(\beta\) value for the Softplus formulation. Default: 1

    threshold

    values above this revert to a linear function. Default: 20

    - -

    Details

    - -

    SoftPlus is a smooth approximation to the ReLU function and can be used -to constrain the output of a machine to always be positive. -For numerical stability the implementation reverts to the linear function -when \(input \times \beta > threshold\).

    -

    Shape

    - - - -
      -
• Input: \((N, *)\) where * means any number of additional dimensions

    • -
    • Output: \((N, *)\), same shape as the input

    • -
    - - -

    Examples

    -
    # \dontrun{ -m <- nn_softplus() -input <- torch_randn(2) -output <- m(input) - -# }
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_softshrink.html b/docs/reference/nn_softshrink.html deleted file mode 100644 index db79678e3a1781e1b3318413632b984b7bb69967..0000000000000000000000000000000000000000 --- a/docs/reference/nn_softshrink.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Softshrink module — nn_softshrink • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the soft shrinkage function elementwise:

    -
    - -
    nn_softshrink(lambd = 0.5)
    - -

    Arguments

    - - - - - - -
    lambd

the \(\lambda\) value (must be no less than zero) for the Softshrink formulation. Default: 0.5

    - -

    Details

    - -

    $$ - \mbox{SoftShrinkage}(x) = - \left\{ \begin{array}{ll} -x - \lambda, & \mbox{ if } x > \lambda \\ -x + \lambda, & \mbox{ if } x < -\lambda \\ -0, & \mbox{ otherwise } -\end{array} -\right. -$$

    -

    Shape

    - - - -
      -
• Input: \((N, *)\) where * means any number of additional dimensions

    • -
    • Output: \((N, *)\), same shape as the input

    • -
    - - -

    Examples

    -
    # \dontrun{ -m <- nn_softshrink() -input <- torch_randn(2) -output <- m(input) - -# }
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_softsign.html b/docs/reference/nn_softsign.html deleted file mode 100644 index 30815b625c0fe05b288b8456a5abf57a68f91e0b..0000000000000000000000000000000000000000 --- a/docs/reference/nn_softsign.html +++ /dev/null @@ -1,220 +0,0 @@ - - - - - - - - -Softsign module — nn_softsign • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the element-wise function: -$$ - \mbox{SoftSign}(x) = \frac{x}{ 1 + |x|} -$$

    -
    - -
    nn_softsign()
    - - -

    Shape

    - - - -
      -
• Input: \((N, *)\) where * means any number of additional dimensions

    • -
    • Output: \((N, *)\), same shape as the input

    • -
    - - -

    Examples

    -
    # \dontrun{ -m <- nn_softsign() -input <- torch_randn(2) -output <- m(input) - -# }
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_tanh.html b/docs/reference/nn_tanh.html deleted file mode 100644 index 7a1a1f3106323faff4d2448b23bc4c4f6a6343f1..0000000000000000000000000000000000000000 --- a/docs/reference/nn_tanh.html +++ /dev/null @@ -1,219 +0,0 @@ - - - - - - - - -Tanh module — nn_tanh • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the element-wise function:

    -
    - -
    nn_tanh()
    - - -

    Details

    - -

    $$ - \mbox{Tanh}(x) = \tanh(x) = \frac{\exp(x) - \exp(-x)} {\exp(x) + \exp(-x)} -$$

    -

    Shape

    - - - -
      -
• Input: \((N, *)\) where * means any number of additional dimensions

    • -
    • Output: \((N, *)\), same shape as the input

    • -
    - - -

    Examples

    -
    # \dontrun{ -m <- nn_tanh() -input <- torch_randn(2) -output <- m(input) - -# }
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_tanhshrink.html b/docs/reference/nn_tanhshrink.html deleted file mode 100644 index 0bd1f23cfb945141ee20c509390c6ca52919560a..0000000000000000000000000000000000000000 --- a/docs/reference/nn_tanhshrink.html +++ /dev/null @@ -1,219 +0,0 @@ - - - - - - - - -Tanhshrink module — nn_tanhshrink • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the element-wise function:

    -
    - -
    nn_tanhshrink()
    - - -

    Details

    - -

    $$ - \mbox{Tanhshrink}(x) = x - \tanh(x) -$$

    -

    Shape

    - - - -
      -
• Input: \((N, *)\) where * means any number of additional dimensions

    • -
    • Output: \((N, *)\), same shape as the input

    • -
    - - -

    Examples

    -
    # \dontrun{ -m <- nn_tanhshrink() -input <- torch_randn(2) -output <- m(input) - -# }
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
- - - - - - - - diff --git a/docs/reference/nn_threshold.html b/docs/reference/nn_threshold.html deleted file mode 100644 index b7f43279661efffa04820e46baa98de984ff9f9e..0000000000000000000000000000000000000000 --- a/docs/reference/nn_threshold.html +++ /dev/null @@ -1,241 +0,0 @@ - - - - - - - - -Threshold module — nn_threshold • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Thresholds each element of the input Tensor.

    -
    - -
    nn_threshold(threshold, value, inplace = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    threshold

    The value to threshold at

    value

    The value to replace with

    inplace

    can optionally do the operation in-place. Default: FALSE

    - -

    Details

    - -

    Threshold is defined as: -$$ - y = - \left\{ \begin{array}{ll} - x, &\mbox{ if } x > \mbox{threshold} \\ - \mbox{value}, &\mbox{ otherwise } - \end{array} - \right. -$$

    -

    Shape

    - - - -
      -
• Input: \((N, *)\) where * means any number of additional dimensions

    • -
    • Output: \((N, *)\), same shape as the input

    • -
    - - -

    Examples

    -
    # \dontrun{ -m <- nn_threshold(0.1, 20) -input <- torch_randn(2) -output <- m(input) - -# }
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_utils_rnn_pack_padded_sequence.html b/docs/reference/nn_utils_rnn_pack_padded_sequence.html deleted file mode 100644 index 89df73e9ff58d29e7f3375e8bcedbbe0bcc5b6f6..0000000000000000000000000000000000000000 --- a/docs/reference/nn_utils_rnn_pack_padded_sequence.html +++ /dev/null @@ -1,246 +0,0 @@ - - - - - - - - -Packs a Tensor containing padded sequences of variable length. — nn_utils_rnn_pack_padded_sequence • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    input can be of size T x B x * where T is the length of the -longest sequence (equal to lengths[1]), B is the batch size, and -* is any number of dimensions (including 0). If batch_first is -TRUE, B x T x * input is expected.

    -
    - -
    nn_utils_rnn_pack_padded_sequence(
    -  input,
    -  lengths,
    -  batch_first = FALSE,
    -  enforce_sorted = TRUE
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    (Tensor): padded batch of variable length sequences.

    lengths

(Tensor): list of sequence lengths of each batch element.

    batch_first

    (bool, optional): if TRUE, the input is expected in B x T x * -format.

    enforce_sorted

    (bool, optional): if TRUE, the input is expected to -contain sequences sorted by length in a decreasing order. If -FALSE, the input will get sorted unconditionally. Default: TRUE.

    - -

    Value

    - -

    a PackedSequence object

    -

    Details

    - -

    For unsorted sequences, use enforce_sorted = FALSE. If enforce_sorted is -TRUE, the sequences should be sorted by length in a decreasing order, i.e. -input[,1] should be the longest sequence, and input[,B] the shortest -one. enforce_sorted = TRUE is only necessary for ONNX export.

    -

    Note

    - -

    This function accepts any input that has at least two dimensions. You -can apply it to pack the labels, and use the output of the RNN with -them to compute the loss directly. A Tensor can be retrieved from -a PackedSequence object by accessing its .data attribute.
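This page ships without an Examples section; a minimal usage sketch (shapes and lengths chosen arbitrarily, batch_first layout) could look like:

# two sequences of lengths 3 and 2, padded to a common length of 3
x <- torch_tensor(rbind(c(1, 2, 3),
                        c(4, 5, 0)))
p <- nn_utils_rnn_pack_padded_sequence(x, lengths = c(3, 2), batch_first = TRUE)
p$data  # the flattened, packed data tensor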

    - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_utils_rnn_pack_sequence.html b/docs/reference/nn_utils_rnn_pack_sequence.html deleted file mode 100644 index afddfe3283138ddaa819dd44e6a6409ca0f8d3ea..0000000000000000000000000000000000000000 --- a/docs/reference/nn_utils_rnn_pack_sequence.html +++ /dev/null @@ -1,232 +0,0 @@ - - - - - - - - -Packs a list of variable length Tensors — nn_utils_rnn_pack_sequence • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    sequences should be a list of Tensors of size L x *, where L is -the length of a sequence and * is any number of trailing dimensions, -including zero.

    -
    - -
    nn_utils_rnn_pack_sequence(sequences, enforce_sorted = TRUE)
    - -

    Arguments

    - - - - - - - - - - -
    sequences

    (list[Tensor]): A list of sequences of decreasing length.

    enforce_sorted

    (bool, optional): if TRUE, checks that the input -contains sequences sorted by length in a decreasing order. If -FALSE, this condition is not checked. Default: TRUE.

    - -

    Value

    - -

    a PackedSequence object

    -

    Details

    - -

    For unsorted sequences, use enforce_sorted = FALSE. If enforce_sorted -is TRUE, the sequences should be sorted in the order of decreasing length. -enforce_sorted = TRUE is only necessary for ONNX export.

    - -

    Examples

    -
    # \dontrun{ -x <- torch_tensor(c(1,2,3), dtype = torch_long()) -y <- torch_tensor(c(4, 5), dtype = torch_long()) -z <- torch_tensor(c(6), dtype = torch_long()) - -p <- nn_utils_rnn_pack_sequence(list(x, y, z)) - -# }
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_utils_rnn_pad_packed_sequence.html b/docs/reference/nn_utils_rnn_pad_packed_sequence.html deleted file mode 100644 index 7fd5237e87006f23d034ceacbf417c5cf03373eb..0000000000000000000000000000000000000000 --- a/docs/reference/nn_utils_rnn_pad_packed_sequence.html +++ /dev/null @@ -1,273 +0,0 @@ - - - - - - - - -Pads a packed batch of variable length sequences. — nn_utils_rnn_pad_packed_sequence • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    It is an inverse operation to nn_utils_rnn_pack_padded_sequence().

    -
    - -
    nn_utils_rnn_pad_packed_sequence(
    -  sequence,
    -  batch_first = FALSE,
    -  padding_value = 0,
    -  total_length = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    sequence

    (PackedSequence): batch to pad

    batch_first

(bool, optional): if TRUE, the output will be in B x T x * format.

    padding_value

    (float, optional): values for padded elements.

    total_length

(int, optional): if not NULL, the output will be padded to have length total_length. This method will throw an error if total_length is less than the max sequence length in sequence.

    - -

    Value

    - -

    Tuple of Tensor containing the padded sequence, and a Tensor -containing the list of lengths of each sequence in the batch. -Batch elements will be re-ordered as they were ordered originally when -the batch was passed to nn_utils_rnn_pack_padded_sequence() or -nn_utils_rnn_pack_sequence().

    -

    Details

    - -

    The returned Tensor's data will be of size T x B x *, where T is the length -of the longest sequence and B is the batch size. If batch_first is TRUE, -the data will be transposed into B x T x * format.

    -

    Note

    - -

total_length is useful to implement the pack sequence -> recurrent network -> unpack sequence pattern in an nn_module wrapped in torch.nn.DataParallel.

    - -

    Examples

    -
    # \dontrun{ -seq <- torch_tensor(rbind(c(1,2,0), c(3,0,0), c(4,5,6))) -lens <- c(2,1,3) -packed <- nn_utils_rnn_pack_padded_sequence(seq, lens, batch_first = TRUE, - enforce_sorted = FALSE) -packed
    #> <PackedSequence> -#> Public: -#> batch_sizes: active binding -#> clone: function (deep = FALSE) -#> data: active binding -#> initialize: function (ptr = NULL) -#> ptr: externalptr -#> sorted_indices: active binding -#> unsorted_indices: active binding
    nn_utils_rnn_pad_packed_sequence(packed, batch_first=TRUE)
    #> [[1]] -#> torch_tensor -#> 1 2 0 -#> 3 0 0 -#> 4 5 6 -#> [ CPUFloatType{3,3} ] -#> -#> [[2]] -#> torch_tensor -#> 2 -#> 1 -#> 3 -#> [ CPULongType{3} ] -#>
    -# } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nn_utils_rnn_pad_sequence.html b/docs/reference/nn_utils_rnn_pad_sequence.html deleted file mode 100644 index 9d35f911210b22f73905faa9cb790e25d65d69bd..0000000000000000000000000000000000000000 --- a/docs/reference/nn_utils_rnn_pad_sequence.html +++ /dev/null @@ -1,243 +0,0 @@ - - - - - - - - -Pad a list of variable length Tensors with <code>padding_value</code> — nn_utils_rnn_pad_sequence • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

pad_sequence stacks a list of Tensors along a new dimension, and pads them to equal length. For example, if the input is a list of sequences with size L x *, the output is of size T x B x * if batch_first is FALSE, and B x T x * otherwise.

    -
    - -
    nn_utils_rnn_pad_sequence(sequences, batch_first = FALSE, padding_value = 0)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    sequences

    (list[Tensor]): list of variable length sequences.

    batch_first

    (bool, optional): output will be in B x T x * if TRUE, -or in T x B x * otherwise

    padding_value

    (float, optional): value for padded elements. Default: 0.

    - -

    Value

    - -

    Tensor of size T x B x * if batch_first is FALSE. -Tensor of size B x T x * otherwise

    -

    Details

    - -

    B is batch size. It is equal to the number of elements in sequences. -T is length of the longest sequence. -L is length of the sequence. -* is any number of trailing dimensions, including none.

    -

    Note

    - -

This function returns a Tensor of size T x B x * or B x T x * where T is the length of the longest sequence. This function assumes the trailing dimensions and type of all the Tensors in sequences are the same.

    - -

    Examples

    -
    # \dontrun{ -a <- torch_ones(25, 300) -b <- torch_ones(22, 300) -c <- torch_ones(15, 300) -nn_utils_rnn_pad_sequence(list(a, b, c))$size()
    #> [1] 25 3 300
    -# } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_adaptive_avg_pool1d.html b/docs/reference/nnf_adaptive_avg_pool1d.html deleted file mode 100644 index 9b1e89528e9e150a0675accc4713abed0e49bd95..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_adaptive_avg_pool1d.html +++ /dev/null @@ -1,211 +0,0 @@ - - - - - - - - -Adaptive_avg_pool1d — nnf_adaptive_avg_pool1d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 1D adaptive average pooling over an input signal composed of -several input planes.

    -
    - -
    nnf_adaptive_avg_pool1d(input, output_size)
    - -

    Arguments

    - - - - - - - - - - -
    input

    input tensor of shape (minibatch , in_channels , iW)

    output_size

    the target output size (single integer)

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_adaptive_avg_pool2d.html b/docs/reference/nnf_adaptive_avg_pool2d.html deleted file mode 100644 index 1af950d67e0ca4fedcfb3340887d3335343c5412..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_adaptive_avg_pool2d.html +++ /dev/null @@ -1,211 +0,0 @@ - - - - - - - - -Adaptive_avg_pool2d — nnf_adaptive_avg_pool2d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 2D adaptive average pooling over an input signal composed of -several input planes.

    -
    - -
    nnf_adaptive_avg_pool2d(input, output_size)
    - -

    Arguments

    - - - - - - - - - - -
    input

    input tensor (minibatch, in_channels , iH , iW)

    output_size

    the target output size (single integer or double-integer tuple)

    - - -
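No example is shipped on this page; a minimal sketch (sizes chosen arbitrarily) could be:

# pool an 8x8 feature map down to a fixed 2x2 output, whatever the input size
x <- torch_randn(1, 3, 8, 8)
nnf_adaptive_avg_pool2d(x, output_size = c(2, 2))$size()  # 1 3 2 2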
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_adaptive_avg_pool3d.html b/docs/reference/nnf_adaptive_avg_pool3d.html deleted file mode 100644 index f0fb43fe3775e88f8c6e61663e7eb63fa3fc1b4f..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_adaptive_avg_pool3d.html +++ /dev/null @@ -1,211 +0,0 @@ - - - - - - - - -Adaptive_avg_pool3d — nnf_adaptive_avg_pool3d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 3D adaptive average pooling over an input signal composed of -several input planes.

    -
    - -
    nnf_adaptive_avg_pool3d(input, output_size)
    - -

    Arguments

    - - - - - - - - - - -
    input

input tensor (minibatch , in_channels , iT , iH , iW)

    output_size

    the target output size (single integer or triple-integer tuple)

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_adaptive_max_pool1d.html b/docs/reference/nnf_adaptive_max_pool1d.html deleted file mode 100644 index 88e4b28392c17912777df1116eb8479a0a665edd..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_adaptive_max_pool1d.html +++ /dev/null @@ -1,215 +0,0 @@ - - - - - - - - -Adaptive_max_pool1d — nnf_adaptive_max_pool1d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 1D adaptive max pooling over an input signal composed of -several input planes.

    -
    - -
    nnf_adaptive_max_pool1d(input, output_size, return_indices = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch , in_channels , iW)

    output_size

    the target output size (single integer)

    return_indices

    whether to return pooling indices. Default: FALSE

    - - -
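A minimal sketch of the return_indices option (assuming, as in PyTorch, that a list of pooled values and argmax indices is returned when it is TRUE):

x <- torch_randn(1, 2, 16)
res <- nnf_adaptive_max_pool1d(x, output_size = 4, return_indices = TRUE)
res[[1]]  # pooled values, shape 1 2 4
res[[2]]  # positions of the maxima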
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_adaptive_max_pool2d.html b/docs/reference/nnf_adaptive_max_pool2d.html deleted file mode 100644 index a114c3e35dd7a07b2aca25421997482e62680c5c..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_adaptive_max_pool2d.html +++ /dev/null @@ -1,215 +0,0 @@ - - - - - - - - -Adaptive_max_pool2d — nnf_adaptive_max_pool2d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 2D adaptive max pooling over an input signal composed of -several input planes.

    -
    - -
    nnf_adaptive_max_pool2d(input, output_size, return_indices = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    input tensor (minibatch, in_channels , iH , iW)

    output_size

    the target output size (single integer or double-integer tuple)

    return_indices

    whether to return pooling indices. Default: FALSE

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_adaptive_max_pool3d.html b/docs/reference/nnf_adaptive_max_pool3d.html deleted file mode 100644 index d4ff2df41aae343b62c37776bb82b06fa0dd1e2b..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_adaptive_max_pool3d.html +++ /dev/null @@ -1,215 +0,0 @@ - - - - - - - - -Adaptive_max_pool3d — nnf_adaptive_max_pool3d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 3D adaptive max pooling over an input signal composed of -several input planes.

    -
    - -
    nnf_adaptive_max_pool3d(input, output_size, return_indices = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

input tensor (minibatch , in_channels , iT , iH , iW)

    output_size

    the target output size (single integer or triple-integer tuple)

    return_indices

whether to return pooling indices. Default: FALSE

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_affine_grid.html b/docs/reference/nnf_affine_grid.html deleted file mode 100644 index a90e6eea6b3bf9ac1b07bb239409912dec6efcdc..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_affine_grid.html +++ /dev/null @@ -1,229 +0,0 @@ - - - - - - - - -Affine_grid — nnf_affine_grid • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Generates a 2D or 3D flow field (sampling grid), given a batch of -affine matrices theta.

    -
    - -
    nnf_affine_grid(theta, size, align_corners = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    theta

    (Tensor) input batch of affine matrices with shape -(\(N \times 2 \times 3\)) for 2D or (\(N \times 3 \times 4\)) for 3D

    size

    (torch.Size) the target output image size. (\(N \times C \times H \times W\) -for 2D or \(N \times C \times D \times H \times W\) for 3D) -Example: torch.Size((32, 3, 24, 24))

    align_corners

(bool, optional) if TRUE, consider -1 and 1 to refer to the centers of the corner pixels rather than the image corners. Refer to nnf_grid_sample() for a more complete description. A grid generated by nnf_affine_grid() should be passed to nnf_grid_sample() with the same setting for this option. Default: FALSE

    - -

    Note

    - - - - -

This function is often used in conjunction with nnf_grid_sample() to build Spatial Transformer Networks.
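As a sketch of that pairing (an identity transform here, so the resampled image should match the input up to interpolation; sizes chosen arbitrarily):

theta <- torch_tensor(rbind(c(1, 0, 0),
                            c(0, 1, 0)))$unsqueeze(1)  # N x 2 x 3 batch of affine matrices
grid <- nnf_affine_grid(theta, size = c(1, 1, 4, 4))
img <- torch_randn(1, 1, 4, 4)
out <- nnf_grid_sample(img, grid)  # ~ img, since theta is the identity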

    - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_alpha_dropout.html b/docs/reference/nnf_alpha_dropout.html deleted file mode 100644 index 31fad0a03325efc533fdefc422d50f19d9edb76b..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_alpha_dropout.html +++ /dev/null @@ -1,218 +0,0 @@ - - - - - - - - -Alpha_dropout — nnf_alpha_dropout • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies alpha dropout to the input.

    -
    - -
    nnf_alpha_dropout(input, p = 0.5, training = FALSE, inplace = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    the input tensor

    p

    probability of an element to be zeroed. Default: 0.5

    training

apply dropout if TRUE. Default: FALSE

    inplace

    If set to TRUE, will do this operation in-place. -Default: FALSE

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_avg_pool1d.html b/docs/reference/nnf_avg_pool1d.html deleted file mode 100644 index e817c31eb60c95b8bfcb934ea8d471ecc5243cc2..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_avg_pool1d.html +++ /dev/null @@ -1,239 +0,0 @@ - - - - - - - - -Avg_pool1d — nnf_avg_pool1d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 1D average pooling over an input signal composed of several -input planes.

    -
    - -
    nnf_avg_pool1d(
    -  input,
    -  kernel_size,
    -  stride = NULL,
    -  padding = 0,
    -  ceil_mode = FALSE,
    -  count_include_pad = TRUE
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch , in_channels , iW)

    kernel_size

    the size of the window. Can be a single number or a -tuple (kW,).

    stride

    the stride of the window. Can be a single number or a tuple -(sW,). Default: kernel_size

    padding

    implicit zero paddings on both sides of the input. Can be a -single number or a tuple (padW,). Default: 0

    ceil_mode

when TRUE, will use ceil instead of floor to compute the output shape. Default: FALSE

    count_include_pad

when TRUE, will include the zero-padding in the averaging calculation. Default: TRUE

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_avg_pool2d.html b/docs/reference/nnf_avg_pool2d.html deleted file mode 100644 index 7350448ebdc5d4ff46597a3f7c110b66c800d8fa..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_avg_pool2d.html +++ /dev/null @@ -1,247 +0,0 @@ - - - - - - - - -Avg_pool2d — nnf_avg_pool2d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies 2D average-pooling operation in \(kH * kW\) regions by step size -\(sH * sW\) steps. The number of output features is equal to the number of -input planes.

    -
    - -
    nnf_avg_pool2d(
    -  input,
    -  kernel_size,
    -  stride = NULL,
    -  padding = 0,
    -  ceil_mode = FALSE,
    -  count_include_pad = TRUE,
    -  divisor_override = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor (minibatch, in_channels , iH , iW)

    kernel_size

    size of the pooling region. Can be a single number or a -tuple (kH, kW)

    stride

    stride of the pooling operation. Can be a single number or a -tuple (sH, sW). Default: kernel_size

    padding

    implicit zero paddings on both sides of the input. Can be a -single number or a tuple (padH, padW). Default: 0

    ceil_mode

when TRUE, will use ceil instead of floor in the formula to compute the output shape. Default: FALSE

    count_include_pad

when TRUE, will include the zero-padding in the averaging calculation. Default: TRUE

    divisor_override

if specified, it will be used as the divisor; otherwise the size of the pooling region will be used. Default: NULL

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_avg_pool3d.html b/docs/reference/nnf_avg_pool3d.html deleted file mode 100644 index a82de3ee2f27018862fc6a53e58f7a6b39cc4099..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_avg_pool3d.html +++ /dev/null @@ -1,247 +0,0 @@ - - - - - - - - -Avg_pool3d — nnf_avg_pool3d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies 3D average-pooling operation in \(kT * kH * kW\) regions by step -size \(sT * sH * sW\) steps. The number of output features is equal to -\(\lfloor \frac{ \mbox{input planes} }{sT} \rfloor\).

    -
    - -
    nnf_avg_pool3d(
    -  input,
    -  kernel_size,
    -  stride = NULL,
    -  padding = 0,
    -  ceil_mode = FALSE,
    -  count_include_pad = TRUE,
    -  divisor_override = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

input tensor (minibatch , in_channels , iT , iH , iW)

    kernel_size

    size of the pooling region. Can be a single number or a -tuple (kT, kH, kW)

    stride

    stride of the pooling operation. Can be a single number or a -tuple (sT, sH, sW). Default: kernel_size

    padding

    implicit zero paddings on both sides of the input. Can be a -single number or a tuple (padT, padH, padW), Default: 0

    ceil_mode

when TRUE, will use ceil instead of floor in the formula to compute the output shape. Default: FALSE

    count_include_pad

when TRUE, will include the zero-padding in the averaging calculation. Default: TRUE

    divisor_override

if specified, it will be used as the divisor; otherwise the size of the pooling region will be used. Default: NULL

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_batch_norm.html b/docs/reference/nnf_batch_norm.html deleted file mode 100644 index 758554bf9b0feb608f08e68ae48a845e5138384e..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_batch_norm.html +++ /dev/null @@ -1,243 +0,0 @@ - - - - - - - - -Batch_norm — nnf_batch_norm • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies Batch Normalization for each channel across a batch of data.

    -
    - -
    nnf_batch_norm(
    -  input,
    -  running_mean,
    -  running_var,
    -  weight = NULL,
    -  bias = NULL,
    -  training = FALSE,
    -  momentum = 0.1,
    -  eps = 1e-05
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor

    running_mean

    the running_mean tensor

    running_var

    the running_var tensor

    weight

    the weight tensor

    bias

    the bias tensor

    training

bool, whether it's training. Default: FALSE

    momentum

the value used for the running_mean and running_var computation. Can be set to NULL for a cumulative moving average (i.e. simple average). Default: 0.1

    eps

    a value added to the denominator for numerical stability. Default: 1e-5

    - - -
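A minimal sketch (statistics initialized by hand; in practice nn_batch_norm2d() manages them for you):

x <- torch_randn(4, 3, 5, 5)    # batch of 3-channel feature maps
running_mean <- torch_zeros(3)  # one statistic per channel
running_var <- torch_ones(3)
y <- nnf_batch_norm(x, running_mean, running_var, training = TRUE)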
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_bilinear.html b/docs/reference/nnf_bilinear.html deleted file mode 100644 index c87d9667dd47d2590ace39612d27b1d2dd95ca2a..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_bilinear.html +++ /dev/null @@ -1,226 +0,0 @@ - - - - - - - - -Bilinear — nnf_bilinear • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a bilinear transformation to the incoming data: -\(y = x_1 A x_2 + b\)

    -
    - -
    nnf_bilinear(input1, input2, weight, bias = NULL)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input1

    \((N, *, H_{in1})\) where \(H_{in1}=\mbox{in1\_features}\) -and \(*\) means any number of additional dimensions. -All but the last dimension of the inputs should be the same.

    input2

    \((N, *, H_{in2})\) where \(H_{in2}=\mbox{in2\_features}\)

    weight

    \((\mbox{out\_features}, \mbox{in1\_features}, -\mbox{in2\_features})\)

    bias

    \((\mbox{out\_features})\)

    - -

    Value

    - -

    output \((N, *, H_{out})\) where \(H_{out}=\mbox{out\_features}\) -and all but the last dimension are the same shape as the input.

    - -
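A minimal sketch (feature sizes chosen arbitrarily):

x1 <- torch_randn(8, 10)        # in1_features = 10
x2 <- torch_randn(8, 20)        # in2_features = 20
w <- torch_randn(5, 10, 20)     # out_features = 5
nnf_bilinear(x1, x2, w)$size()  # 8 5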
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_binary_cross_entropy.html b/docs/reference/nnf_binary_cross_entropy.html deleted file mode 100644 index b8c9b524e21680450c05ea8e2b55238d06c96fc6..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_binary_cross_entropy.html +++ /dev/null @@ -1,227 +0,0 @@ - - - - - - - - -Binary_cross_entropy — nnf_binary_cross_entropy • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Function that measures the Binary Cross Entropy -between the target and the output.

    -
    - -
    nnf_binary_cross_entropy(
    -  input,
    -  target,
    -  weight = NULL,
    -  reduction = c("mean", "sum", "none")
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

tensor (N,*) where * means any number of additional dimensions

    target

    tensor (N,*) , same shape as the input

    weight

    (tensor) weight for each value.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_binary_cross_entropy_with_logits.html b/docs/reference/nnf_binary_cross_entropy_with_logits.html deleted file mode 100644 index 918494367f32c7a5db1146d010eee97c75b43098..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_binary_cross_entropy_with_logits.html +++ /dev/null @@ -1,234 +0,0 @@ - - - - - - - - -Binary_cross_entropy_with_logits — nnf_binary_cross_entropy_with_logits • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Function that measures Binary Cross Entropy between target and output -logits.

    -
    - -
    nnf_binary_cross_entropy_with_logits(
    -  input,
    -  target,
    -  weight = NULL,
    -  reduction = c("mean", "sum", "none"),
    -  pos_weight = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    Tensor of arbitrary shape

    target

    Tensor of the same shape as input

    weight

(Tensor, optional) a manual rescaling weight. If provided, it's repeated to match the input tensor shape.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    pos_weight

    (Tensor, optional) a weight of positive examples. -Must be a vector with length equal to the number of classes.

    - - -
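A minimal sketch; the logits version should agree with applying the sigmoid yourself and then calling nnf_binary_cross_entropy():

logits <- torch_randn(3)
target <- torch_tensor(c(1, 0, 1))
nnf_binary_cross_entropy_with_logits(logits, target)
nnf_binary_cross_entropy(torch_sigmoid(logits), target)  # same value, less numerically stable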
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_celu.html b/docs/reference/nnf_celu.html deleted file mode 100644 index 2df5207f0b8ad020735364ecb8ff4ba480cb73c0..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_celu.html +++ /dev/null @@ -1,216 +0,0 @@ - - - - - - - - -Celu — nnf_celu • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Applies element-wise, \(CELU(x) = max(0,x) + min(0, \alpha * (exp(x / \alpha) - 1))\).

    -
    - -
    nnf_celu(input, alpha = 1, inplace = FALSE)
    -
    -nnf_celu_(input, alpha = 1)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional dimensions

    alpha

    the alpha value for the CELU formulation. Default: 1.0

    inplace

    can optionally do the operation in-place. Default: FALSE

    - - -
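A minimal sketch:

x <- torch_randn(2, 2)
nnf_celu(x, alpha = 0.5)  # negative inputs saturate towards -alpha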
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_conv1d.html b/docs/reference/nnf_conv1d.html deleted file mode 100644 index 3f8a34f33fa02fbdee6f7e03bd0fbe542ff39d81..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_conv1d.html +++ /dev/null @@ -1,243 +0,0 @@ - - - - - - - - -Conv1d — nnf_conv1d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 1D convolution over an input signal composed of several input -planes.

    -
    - -
    nnf_conv1d(
    -  input,
    -  weight,
    -  bias = NULL,
    -  stride = 1,
    -  padding = 0,
    -  dilation = 1,
    -  groups = 1
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch, in_channels , iW)

    weight

    filters of shape (out_channels, in_channels/groups , kW)

    bias

    optional bias of shape (out_channels). Default: NULL

    stride

    the stride of the convolving kernel. Can be a single number or -a one-element tuple (sW,). Default: 1

    padding

    implicit paddings on both sides of the input. Can be a -single number or a one-element tuple (padW,). Default: 0

    dilation

    the spacing between kernel elements. Can be a single number or -a one-element tuple (dW,). Default: 1

    groups

    split input into groups, in_channels should be divisible by -the number of groups. Default: 1

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_conv2d.html b/docs/reference/nnf_conv2d.html deleted file mode 100644 index 076a7144069a2115993230d98aae88cd429a4137..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_conv2d.html +++ /dev/null @@ -1,243 +0,0 @@ - - - - - - - - -Conv2d — nnf_conv2d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 2D convolution over an input image composed of several input -planes.

    -
    - -
    nnf_conv2d(
    -  input,
    -  weight,
    -  bias = NULL,
    -  stride = 1,
    -  padding = 0,
    -  dilation = 1,
    -  groups = 1
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch, in_channels, iH , iW)

    weight

    filters of shape (out_channels , in_channels/groups, kH , kW)

    bias

    optional bias tensor of shape (out_channels). Default: NULL

    stride

    the stride of the convolving kernel. Can be a single number or a -tuple (sH, sW). Default: 1

    padding

    implicit paddings on both sides of the input. Can be a -single number or a tuple (padH, padW). Default: 0

    dilation

    the spacing between kernel elements. Can be a single number or -a tuple (dH, dW). Default: 1

    groups

    split input into groups, in_channels should be divisible by the -number of groups. Default: 1

    - - -
    - -
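A minimal sketch (one image, 3 input channels, 2 filters of size 3x3; padding chosen so the spatial size is preserved):

input <- torch_randn(1, 3, 8, 8)
weight <- torch_randn(2, 3, 3, 3)
nnf_conv2d(input, weight, padding = 1)$size()  # 1 2 8 8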
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_conv3d.html b/docs/reference/nnf_conv3d.html deleted file mode 100644 index 2095988606d8737376c326f2413ebebfc0453efa..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_conv3d.html +++ /dev/null @@ -1,243 +0,0 @@ - - - - - - - - -Conv3d — nnf_conv3d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 3D convolution over an input image composed of several input -planes.

    -
    - -
    nnf_conv3d(
    -  input,
    -  weight,
    -  bias = NULL,
    -  stride = 1,
    -  padding = 0,
    -  dilation = 1,
    -  groups = 1
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch, in_channels , iT , iH , iW)

    weight

    filters of shape (out_channels , in_channels/groups, kT , kH , kW)

    bias

    optional bias tensor of shape (out_channels). Default: NULL

    stride

    the stride of the convolving kernel. Can be a single number or a -tuple (sT, sH, sW). Default: 1

    padding

    implicit paddings on both sides of the input. Can be a -single number or a tuple (padT, padH, padW). Default: 0

    dilation

    the spacing between kernel elements. Can be a single number or -a tuple (dT, dH, dW). Default: 1

    groups

    split input into groups, in_channels should be divisible by -the number of groups. Default: 1

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_conv_tbc.html b/docs/reference/nnf_conv_tbc.html deleted file mode 100644 index b9f90e3b76e38db954e70d27a50aae81657d2714..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_conv_tbc.html +++ /dev/null @@ -1,221 +0,0 @@ - - - - - - - - -Conv_tbc — nnf_conv_tbc • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 1-dimensional sequence convolution over an input sequence. -Input and output dimensions are (Time, Batch, Channels) - hence TBC.

    -
    - -
    nnf_conv_tbc(input, weight, bias, pad = 0)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape \((\mbox{sequence length} \times -batch \times \mbox{in\_channels})\)

    weight

    filter of shape (\(\mbox{kernel width} \times \mbox{in\_channels} -\times \mbox{out\_channels}\))

    bias

    bias of shape (\(\mbox{out\_channels}\))

    pad

    number of timesteps to pad. Default: 0

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_conv_transpose1d.html b/docs/reference/nnf_conv_transpose1d.html deleted file mode 100644 index 9cd618c6b6d0ef0297ab6af2d89a00b1980f0507..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_conv_transpose1d.html +++ /dev/null @@ -1,248 +0,0 @@ - - - - - - - - -Conv_transpose1d — nnf_conv_transpose1d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 1D transposed convolution operator over an input signal -composed of several input planes, sometimes also called "deconvolution".

    -
    - -
    nnf_conv_transpose1d(
    -  input,
    -  weight,
    -  bias = NULL,
    -  stride = 1,
    -  padding = 0,
    -  output_padding = 0,
    -  groups = 1,
    -  dilation = 1
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch, in_channels , iW)

    weight

filters of shape (in_channels , out_channels/groups , kW)

    bias

    optional bias of shape (out_channels). Default: NULL

    stride

    the stride of the convolving kernel. Can be a single number or -a one-element tuple (sW,). Default: 1

    padding

    implicit paddings on both sides of the input. Can be a -single number or a one-element tuple (padW,). Default: 0

    output_padding

    padding applied to the output

    groups

    split input into groups, in_channels should be divisible by -the number of groups. Default: 1

    dilation

    the spacing between kernel elements. Can be a single number or -a one-element tuple (dW,). Default: 1

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_conv_transpose2d.html b/docs/reference/nnf_conv_transpose2d.html deleted file mode 100644 index c621ea578126243b98a2f3c067a7a757f3bd1646..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_conv_transpose2d.html +++ /dev/null @@ -1,248 +0,0 @@ - - - - - - - - -Conv_transpose2d — nnf_conv_transpose2d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a 2D transposed convolution operator over an input image -composed of several input planes, sometimes also called "deconvolution".

    -
    - -
    nnf_conv_transpose2d(
    -  input,
    -  weight,
    -  bias = NULL,
    -  stride = 1,
    -  padding = 0,
    -  output_padding = 0,
    -  groups = 1,
    -  dilation = 1
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch, in_channels, iH , iW)

    weight

filters of shape (in_channels , out_channels/groups , kH , kW)

    bias

    optional bias tensor of shape (out_channels). Default: NULL

    stride

    the stride of the convolving kernel. Can be a single number or a -tuple (sH, sW). Default: 1

    padding

    implicit paddings on both sides of the input. Can be a -single number or a tuple (padH, padW). Default: 0

    output_padding

    padding applied to the output

    groups

    split input into groups, in_channels should be divisible by the -number of groups. Default: 1

    dilation

    the spacing between kernel elements. Can be a single number or -a tuple (dH, dW). Default: 1

    - - -
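A minimal sketch; note the weight layout, which (unlike nnf_conv2d()) puts in_channels first:

input <- torch_randn(1, 2, 4, 4)
weight <- torch_randn(2, 3, 3, 3)  # (in_channels, out_channels/groups, kH, kW)
nnf_conv_transpose2d(input, weight)$size()  # 1 3 6 6 -- the spatial size grows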
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_conv_transpose3d.html b/docs/reference/nnf_conv_transpose3d.html deleted file mode 100644 index 2ba80f720cf1ec7fbc622e6ecc921698f8d82caa..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_conv_transpose3d.html +++ /dev/null @@ -1,248 +0,0 @@ - - - - - - - - -Conv_transpose3d — nnf_conv_transpose3d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Applies a 3D transposed convolution operator over an input image composed of several input planes, sometimes also called "deconvolution".

    -
    - -
    nnf_conv_transpose3d(
    -  input,
    -  weight,
    -  bias = NULL,
    -  stride = 1,
    -  padding = 0,
    -  output_padding = 0,
    -  groups = 1,
    -  dilation = 1
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of shape (minibatch, in_channels , iT , iH , iW)

    weight

filters of shape (in_channels , out_channels/groups , kT , kH , kW)

    bias

    optional bias tensor of shape (out_channels). Default: NULL

    stride

    the stride of the convolving kernel. Can be a single number or a -tuple (sT, sH, sW). Default: 1

    padding

    implicit paddings on both sides of the input. Can be a -single number or a tuple (padT, padH, padW). Default: 0

    output_padding

    padding applied to the output

    groups

    split input into groups, in_channels should be divisible by -the number of groups. Default: 1

    dilation

    the spacing between kernel elements. Can be a single number or -a tuple (dT, dH, dW). Default: 1

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_cosine_embedding_loss.html b/docs/reference/nnf_cosine_embedding_loss.html deleted file mode 100644 index a8e21e8c0ab45d833a8820f6b7cd197c465d03e0..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_cosine_embedding_loss.html +++ /dev/null @@ -1,237 +0,0 @@ - - - - - - - - -Cosine_embedding_loss — nnf_cosine_embedding_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Creates a criterion that measures the loss given input tensors x_1, x_2 and a -Tensor label y with values 1 or -1. This is used for measuring whether two inputs -are similar or dissimilar, using the cosine distance, and is typically used -for learning nonlinear embeddings or semi-supervised learning.

    -
    - -
    nnf_cosine_embedding_loss(
    -  input1,
    -  input2,
    -  target,
    -  margin = 0,
    -  reduction = c("mean", "sum", "none")
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input1

    the input x_1 tensor

    input2

    the input x_2 tensor

    target

    the target tensor

    margin

Should be a number from -1 to 1; 0 to 0.5 is suggested. If margin is missing, the default value is 0.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
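A minimal sketch (y = 1 marks similar pairs, y = -1 dissimilar ones; sizes chosen arbitrarily):

x1 <- torch_randn(4, 10)
x2 <- torch_randn(4, 10)
y <- torch_tensor(c(1, -1, 1, -1))
nnf_cosine_embedding_loss(x1, x2, y, margin = 0.2)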
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_cosine_similarity.html b/docs/reference/nnf_cosine_similarity.html deleted file mode 100644 index bc86e3d15db4db909c9e815fe294e350e7d9e2d8..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_cosine_similarity.html +++ /dev/null @@ -1,223 +0,0 @@ - - - - - - - - -Cosine_similarity — nnf_cosine_similarity • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Returns cosine similarity between x1 and x2, computed along dim.

    -
    - -
    nnf_cosine_similarity(x1, x2, dim = 1, eps = 1e-08)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    x1

    (Tensor) First input.

    x2

    (Tensor) Second input (of size matching x1).

    dim

    (int, optional) Dimension of vectors. Default: 1

    eps

    (float, optional) Small value to avoid division by zero. -Default: 1e-8

    - -

    Details

    - -

    $$ - \mbox{similarity} = \frac{x_1 \cdot x_2}{\max(\Vert x_1 \Vert _2 \cdot \Vert x_2 \Vert _2, \epsilon)} -$$

    - -
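A minimal sketch (one similarity per row, computed over the feature dimension; dims are 1-based in R):

x1 <- torch_randn(4, 100)
x2 <- torch_randn(4, 100)
nnf_cosine_similarity(x1, x2, dim = 2)  # tensor of length 4, values in [-1, 1]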
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_cross_entropy.html b/docs/reference/nnf_cross_entropy.html deleted file mode 100644 index d97f2ae53fc20f0b410cfff7ce15854f1c69c81f..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_cross_entropy.html +++ /dev/null @@ -1,237 +0,0 @@ - - - - - - - - -Cross_entropy — nnf_cross_entropy • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    This criterion combines log_softmax and nll_loss in a single -function.

    -
    - -
    nnf_cross_entropy(
    -  input,
    -  target,
    -  weight = NULL,
    -  ignore_index = -100,
    -  reduction = c("mean", "sum", "none")
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) \((N, C)\) where C = number of classes or \((N, C, H, W)\) -in case of 2D Loss, or \((N, C, d_1, d_2, ..., d_K)\) where \(K \geq 1\) -in the case of K-dimensional loss.

    target

    (Tensor) \((N)\) where each value is \(0 \leq \mbox{targets}[i] \leq C-1\), -or \((N, d_1, d_2, ..., d_K)\) where \(K \geq 1\) for K-dimensional loss.

    weight

    (Tensor, optional) a manual rescaling weight given to each class. If -given, has to be a Tensor of size C

    ignore_index

    (int, optional) Specifies a target value that is ignored -and does not contribute to the input gradient.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
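A minimal sketch (3 observations, 5 classes; targets are 1-based class indices, following the usual R convention):

input <- torch_randn(3, 5)  # raw, unnormalized logits
target <- torch_tensor(c(2, 1, 5), dtype = torch_long())
nnf_cross_entropy(input, target)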
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_ctc_loss.html b/docs/reference/nnf_ctc_loss.html deleted file mode 100644 index 62e8152d616a19c27b84fb626ee40b5c988ca486..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_ctc_loss.html +++ /dev/null @@ -1,245 +0,0 @@ - - - - - - - - -Ctc_loss — nnf_ctc_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    The Connectionist Temporal Classification loss.

    -
    - -
    nnf_ctc_loss(
    -  log_probs,
    -  targets,
    -  input_lengths,
    -  target_lengths,
    -  blank = 0,
    -  reduction = c("mean", "sum", "none"),
    -  zero_infinity = FALSE
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    log_probs

    \((T, N, C)\) where C = number of characters in alphabet including blank, -T = input length, and N = batch size. The logarithmized probabilities of -the outputs (e.g. obtained with nnf_log_softmax).

    targets

    \((N, S)\) or (sum(target_lengths)). Targets cannot be blank. -In the second form, the targets are assumed to be concatenated.

    input_lengths

    \((N)\). Lengths of the inputs (must each be \(\leq T\))

    target_lengths

    \((N)\). Lengths of the targets

    blank

    (int, optional) Blank label. Default \(0\).

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    zero_infinity

    (bool, optional) Whether to zero infinite losses and the -associated gradients. Default: FALSE Infinite losses mainly occur when the -inputs are too short to be aligned to the targets.

    - - -
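A minimal sketch, assuming torch_randint() and torch_full() with the signatures used below (T = 50 timesteps, N = 4 sequences, C = 20 classes; class 0 is the blank, so target values lie in 1..19):

log_probs <- nnf_log_softmax(torch_randn(50, 4, 20), dim = 3)
targets <- torch_randint(1, 20, size = c(4, 30), dtype = torch_long())
input_lengths <- torch_full(c(4), 50, dtype = torch_long())
target_lengths <- torch_randint(10, 30, size = c(4), dtype = torch_long())
nnf_ctc_loss(log_probs, targets, input_lengths, target_lengths)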
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_dropout.html b/docs/reference/nnf_dropout.html deleted file mode 100644 index e3a21289f1274c8abdd500116abd04b1ffb0116e..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_dropout.html +++ /dev/null @@ -1,222 +0,0 @@ - - - - - - - - -Dropout — nnf_dropout • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    During training, randomly zeroes some of the elements of the input -tensor with probability p using samples from a Bernoulli -distribution.

    -
    - -
    nnf_dropout(input, p = 0.5, training = TRUE, inplace = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    the input tensor

    p

    probability of an element to be zeroed. Default: 0.5

    training

apply dropout if TRUE. Default: TRUE

    inplace

    If set to TRUE, will do this operation in-place. -Default: FALSE

    - - -
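A minimal sketch: with p = 0.5, about half the entries are zeroed and the survivors are scaled by 1 / (1 - p):

x <- torch_ones(2, 4)
nnf_dropout(x, p = 0.5, training = TRUE)  # zeros and 2s; unchanged in expectation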
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_dropout2d.html b/docs/reference/nnf_dropout2d.html deleted file mode 100644 index d073239ce146f53bd4f17c89f1330bfd747783a3..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_dropout2d.html +++ /dev/null @@ -1,226 +0,0 @@ - - - - - - - - -Dropout2d — nnf_dropout2d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Randomly zero out entire channels (a channel is a 2D feature map, e.g., the \(j\)-th channel of the \(i\)-th sample in the batched input is a 2D tensor \(input[i, j]\)) of the input tensor. Each channel will be zeroed out independently on every forward call with probability p using samples from a Bernoulli distribution.

    -
    - -
    nnf_dropout2d(input, p = 0.5, training = TRUE, inplace = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    the input tensor

    p

    probability of a channel to be zeroed. Default: 0.5

    training

    apply dropout if is TRUE. Default: TRUE.

    inplace

    If set to TRUE, will do this operation in-place. -Default: FALSE

diff --git a/docs/reference/nnf_dropout3d.html b/docs/reference/nnf_dropout3d.html
deleted file mode 100644
Dropout3d — nnf_dropout3d • torch

Randomly zero out entire channels of the input tensor (a channel is a 3D feature map, e.g., the \(j\)-th channel of the \(i\)-th sample in the batched input is a 3D tensor \(input[i, j]\)). Each channel will be zeroed out independently on every forward call, with probability p, using samples from a Bernoulli distribution.

nnf_dropout3d(input, p = 0.5, training = TRUE, inplace = FALSE)

Arguments

input: the input tensor
p: probability of a channel to be zeroed. Default: 0.5
training: apply dropout if TRUE. Default: TRUE
inplace: if set to TRUE, will do this operation in-place. Default: FALSE

diff --git a/docs/reference/nnf_elu.html b/docs/reference/nnf_elu.html
deleted file mode 100644
Elu — nnf_elu • torch

Applies element-wise
$$ELU(x) = max(0, x) + min(0, \alpha * (exp(x) - 1))$$

nnf_elu(input, alpha = 1, inplace = FALSE)

nnf_elu_(input, alpha = 1)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions
alpha: the alpha value for the ELU formulation. Default: 1.0
inplace: can optionally do the operation in-place. Default: FALSE

Examples

x <- torch_randn(2, 2)
y <- nnf_elu(x, alpha = 1)
nnf_elu_(x, alpha = 1)
#> torch_tensor
#> -0.7520  0.2844
#>  1.3381  0.9215
#> [ CPUFloatType{2,2} ]
#> [1] TRUE
diff --git a/docs/reference/nnf_embedding.html b/docs/reference/nnf_embedding.html
deleted file mode 100644
Embedding — nnf_embedding • torch

A simple lookup table that looks up embeddings in a fixed dictionary, with a fixed size.

nnf_embedding(
  input,
  weight,
  padding_idx = NULL,
  max_norm = NULL,
  norm_type = 2,
  scale_grad_by_freq = FALSE,
  sparse = FALSE
)

Arguments

input: (LongTensor) tensor containing indices into the embedding matrix
weight: (Tensor) the embedding matrix, with number of rows equal to the maximum possible index + 1 and number of columns equal to the embedding size
padding_idx: (int, optional) if given, pads the output with the embedding vector at padding_idx (initialized to zeros) whenever it encounters that index
max_norm: (float, optional) if given, each embedding vector with norm larger than max_norm is renormalized to have norm max_norm. Note: this will modify weight in-place.
norm_type: (float, optional) the p of the p-norm to compute for the max_norm option. Default: 2.
scale_grad_by_freq: (boolean, optional) if given, this will scale gradients by the inverse of the frequency of the words in the mini-batch. Default: FALSE.
sparse: (bool, optional) if TRUE, the gradient w.r.t. weight will be a sparse tensor. See the notes under nn_embedding for more details regarding sparse gradients.

Details

This module is often used to retrieve word embeddings using indices. The input to the module is a list of indices and the embedding matrix, and the output is the corresponding word embeddings.
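
A minimal sketch (not part of the original page), assuming torch's usual 1-based indexing for the lookup indices:

# an embedding matrix for a vocabulary of 10 items, each mapped to 3 dimensions
weight <- torch_randn(10, 3)
# a batch of 2 sequences of 4 indices each
input <- torch_tensor(rbind(c(1, 2, 4, 5), c(4, 3, 2, 9)), dtype = torch_long())
nnf_embedding(input, weight) # a 2 x 4 x 3 tensor of embeddings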

diff --git a/docs/reference/nnf_embedding_bag.html b/docs/reference/nnf_embedding_bag.html
deleted file mode 100644
Embedding_bag — nnf_embedding_bag • torch

Computes sums, means or maxes of bags of embeddings, without instantiating the intermediate embeddings.

nnf_embedding_bag(
  input,
  weight,
  offsets = NULL,
  max_norm = NULL,
  norm_type = 2,
  scale_grad_by_freq = FALSE,
  mode = "mean",
  sparse = FALSE,
  per_sample_weights = NULL,
  include_last_offset = FALSE
)

Arguments

input: (LongTensor) tensor containing bags of indices into the embedding matrix
weight: (Tensor) the embedding matrix, with number of rows equal to the maximum possible index + 1 and number of columns equal to the embedding size
offsets: (LongTensor, optional) only used when input is 1D. offsets determines the starting index position of each bag (sequence) in input.
max_norm: (float, optional) if given, each embedding vector with norm larger than max_norm is renormalized to have norm max_norm. Note: this will modify weight in-place.
norm_type: (float, optional) the p in the p-norm to compute for the max_norm option. Default: 2.
scale_grad_by_freq: (boolean, optional) if given, this will scale gradients by the inverse of the frequency of the words in the mini-batch. Default: FALSE. Note: this option is not supported when mode = "max".
mode: (string, optional) "sum", "mean" or "max". Specifies the way to reduce the bag. Default: "mean"
sparse: (bool, optional) if TRUE, the gradient w.r.t. weight will be a sparse tensor. See the notes under nn_embedding for more details regarding sparse gradients. Note: this option is not supported when mode = "max".
per_sample_weights: (Tensor, optional) a tensor of float / double weights, or NULL to indicate all weights should be taken to be 1. If specified, per_sample_weights must have exactly the same shape as input and is treated as having the same offsets, if those are not NULL.
include_last_offset: (bool, optional) if TRUE, the size of offsets is equal to the number of bags + 1.

diff --git a/docs/reference/nnf_fold.html b/docs/reference/nnf_fold.html
deleted file mode 100644
Fold — nnf_fold • torch

Combines an array of sliding local blocks into a large containing tensor.

nnf_fold(
  input,
  output_size,
  kernel_size,
  dilation = 1,
  padding = 0,
  stride = 1
)

Arguments

input: the input tensor
output_size: the shape of the spatial dimensions of the output (i.e., output$sizes()[-c(1, 2)])
kernel_size: the size of the sliding blocks
dilation: a parameter that controls the stride of elements within the neighborhood. Default: 1
padding: implicit zero padding to be added on both sides of input. Default: 0
stride: the stride of the sliding blocks in the input spatial dimensions. Default: 1

Warning

Currently, only 4-D output tensors (batched image-like tensors) are supported.

diff --git a/docs/reference/nnf_fractional_max_pool2d.html b/docs/reference/nnf_fractional_max_pool2d.html
deleted file mode 100644
Fractional_max_pool2d — nnf_fractional_max_pool2d • torch

Applies 2D fractional max pooling over an input signal composed of several input planes.

nnf_fractional_max_pool2d(
  input,
  kernel_size,
  output_size = NULL,
  output_ratio = NULL,
  return_indices = FALSE,
  random_samples = NULL
)

Arguments

input: the input tensor
kernel_size: the size of the window to take a max over. Can be a single number \(k\) (for a square kernel of \(k * k\)) or a tuple (kH, kW)
output_size: the target output size of the image, of the form \(oH * oW\). Can be a tuple (oH, oW) or a single number \(oH\) for a square image \(oH * oH\)
output_ratio: if one wants to have an output size as a ratio of the input size, this option can be given. This has to be a number or tuple in the range (0, 1)
return_indices: if TRUE, will return the indices along with the outputs
random_samples: optional random samples

Details

Fractional max pooling is described in detail in the paper Fractional MaxPooling by Ben Graham.

The max-pooling operation is applied in \(kH * kW\) regions by a stochastic step size determined by the target output size. The number of output features is equal to the number of input planes.

diff --git a/docs/reference/nnf_fractional_max_pool3d.html b/docs/reference/nnf_fractional_max_pool3d.html
deleted file mode 100644
Fractional_max_pool3d — nnf_fractional_max_pool3d • torch

Applies 3D fractional max pooling over an input signal composed of several input planes.

nnf_fractional_max_pool3d(
  input,
  kernel_size,
  output_size = NULL,
  output_ratio = NULL,
  return_indices = FALSE,
  random_samples = NULL
)

Arguments

input: the input tensor
kernel_size: the size of the window to take a max over. Can be a single number \(k\) (for a cubic kernel of \(k * k * k\)) or a tuple (kT, kH, kW)
output_size: the target output size of the form \(oT * oH * oW\). Can be a tuple (oT, oH, oW) or a single number \(oH\) for a cubic output \(oH * oH * oH\)
output_ratio: if one wants to have an output size as a ratio of the input size, this option can be given. This has to be a number or tuple in the range (0, 1)
return_indices: if TRUE, will return the indices along with the outputs
random_samples: optional random samples (undocumented)

Details

Fractional max pooling is described in detail in the paper Fractional MaxPooling by Ben Graham.

The max-pooling operation is applied in \(kT * kH * kW\) regions by a stochastic step size determined by the target output size. The number of output features is equal to the number of input planes.

diff --git a/docs/reference/nnf_gelu.html b/docs/reference/nnf_gelu.html
deleted file mode 100644
Gelu — nnf_gelu • torch

Gelu

nnf_gelu(input)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions

gelu(input) -> Tensor

Applies element-wise the function
$$GELU(x) = x * \Phi(x)$$
where \(\Phi(x)\) is the Cumulative Distribution Function for the Gaussian Distribution.

See Gaussian Error Linear Units (GELUs).

diff --git a/docs/reference/nnf_glu.html b/docs/reference/nnf_glu.html
deleted file mode 100644
Glu — nnf_glu • torch

The gated linear unit. Computes:

nnf_glu(input, dim = -1)

Arguments

input: (Tensor) input tensor
dim: (int) dimension on which to split the input. Default: -1

Details

$$GLU(a, b) = a \otimes \sigma(b)$$

where input is split in half along dim to form a and b, \(\sigma\) is the sigmoid function and \(\otimes\) is the element-wise product between matrices.

See Language Modeling with Gated Convolutional Networks.
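
A minimal sketch (not part of the original page): the chosen dimension is split in half, so a 4 x 6 input yields a 4 x 3 output.

x <- torch_randn(4, 6)
nnf_glu(x, dim = -1) # first half of the last dim, gated by the sigmoid of the second half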

diff --git a/docs/reference/nnf_grid_sample.html b/docs/reference/nnf_grid_sample.html
deleted file mode 100644
Grid_sample — nnf_grid_sample • torch

Given an input and a flow-field grid, computes the output using input values and pixel locations from grid.

nnf_grid_sample(
  input,
  grid,
  mode = c("bilinear", "nearest"),
  padding_mode = c("zeros", "border", "reflection"),
  align_corners = FALSE
)

Arguments

input: (Tensor) input of shape \((N, C, H_{in}, W_{in})\) (4-D case) or \((N, C, D_{in}, H_{in}, W_{in})\) (5-D case)
grid: (Tensor) flow-field of shape \((N, H_{out}, W_{out}, 2)\) (4-D case) or \((N, D_{out}, H_{out}, W_{out}, 3)\) (5-D case)
mode: (str) interpolation mode to calculate output values: 'bilinear' | 'nearest'. Default: 'bilinear'
padding_mode: (str) padding mode for outside grid values: 'zeros' | 'border' | 'reflection'. Default: 'zeros'
align_corners: (bool, optional) Geometrically, we consider the pixels of the input as squares rather than points. If set to TRUE, the extrema (-1 and 1) are considered as referring to the center points of the input's corner pixels. If set to FALSE, they are instead considered as referring to the corner points of the input's corner pixels, making the sampling more resolution agnostic. This option parallels the align_corners option in nnf_interpolate(), and so whichever option is used here should also be used there to resize the input image before grid sampling. Default: FALSE

Details

Currently, only spatial (4-D) and volumetric (5-D) input are supported.

In the spatial (4-D) case, for input with shape \((N, C, H_{in}, W_{in})\) and grid with shape \((N, H_{out}, W_{out}, 2)\), the output will have shape \((N, C, H_{out}, W_{out})\).

For each output location output[n, , h, w], the size-2 vector grid[n, h, w] specifies input pixel locations x and y, which are used to interpolate the output value output[n, , h, w]. In the case of 5-D inputs, grid[n, d, h, w] specifies the x, y, z pixel locations for interpolating output[n, , d, h, w]. The mode argument specifies the nearest or bilinear interpolation method used to sample the input pixels.

grid specifies the sampling pixel locations normalized by the input spatial dimensions. Therefore, it should have most values in the range [-1, 1]. For example, values x = -1, y = -1 give the left-top pixel of input, and values x = 1, y = 1 give the right-bottom pixel of input.

If grid has values outside the range [-1, 1], the corresponding outputs are handled as defined by padding_mode. Options are:

• padding_mode = "zeros": use 0 for out-of-bound grid locations,
• padding_mode = "border": use border values for out-of-bound grid locations,
• padding_mode = "reflection": use values at locations reflected by the border for out-of-bound grid locations. For a location far away from the border, it will keep being reflected until becoming in bound; e.g., the (normalized) pixel location x = -3.5 reflects by border -1 and becomes x' = 1.5, then reflects by border 1 and becomes x'' = -0.5.

Note

This function is often used in conjunction with nnf_affine_grid() to build Spatial Transformer Networks.

diff --git a/docs/reference/nnf_group_norm.html b/docs/reference/nnf_group_norm.html
deleted file mode 100644
Group_norm — nnf_group_norm • torch

Applies Group Normalization for the last certain number of dimensions.

nnf_group_norm(input, num_groups, weight = NULL, bias = NULL, eps = 1e-05)

Arguments

input: the input tensor
num_groups: number of groups to separate the channels into
weight: the weight tensor
bias: the bias tensor
eps: a value added to the denominator for numerical stability. Default: 1e-5
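
A minimal sketch (not part of the original page): six channels split into three groups of two.

x <- torch_randn(2, 6, 4, 4) # (batch, channels, height, width)
out <- nnf_group_norm(x, num_groups = 3)
out$size() # same shape as the input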

diff --git a/docs/reference/nnf_gumbel_softmax.html b/docs/reference/nnf_gumbel_softmax.html
deleted file mode 100644
Gumbel_softmax — nnf_gumbel_softmax • torch

Samples from the Gumbel-Softmax distribution and optionally discretizes.

nnf_gumbel_softmax(logits, tau = 1, hard = FALSE, dim = -1)

Arguments

logits: [..., num_features] unnormalized log probabilities
tau: non-negative scalar temperature
hard: if TRUE, the returned samples will be discretized as one-hot vectors, but will be differentiated as if they were the soft samples in autograd
dim: (int) a dimension along which softmax will be computed. Default: -1
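
A minimal sketch (not part of the original page): with hard = TRUE each row is a one-hot sample, while gradients still flow as for the soft sample.

logits <- torch_randn(3, 5)
nnf_gumbel_softmax(logits, tau = 1, hard = TRUE)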

diff --git a/docs/reference/nnf_hardshrink.html b/docs/reference/nnf_hardshrink.html
deleted file mode 100644
Hardshrink — nnf_hardshrink • torch

Applies the hard shrinkage function element-wise.

nnf_hardshrink(input, lambd = 0.5)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions
lambd: the lambda value for the Hardshrink formulation. Default: 0.5

diff --git a/docs/reference/nnf_hardsigmoid.html b/docs/reference/nnf_hardsigmoid.html
deleted file mode 100644
Hardsigmoid — nnf_hardsigmoid • torch

Applies the element-wise function \(\mbox{Hardsigmoid}(x) = \frac{ReLU6(x + 3)}{6}\).

nnf_hardsigmoid(input, inplace = FALSE)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions
inplace: if set to TRUE, will do this operation in-place. Default: FALSE

diff --git a/docs/reference/nnf_hardswish.html b/docs/reference/nnf_hardswish.html
deleted file mode 100644
Hardswish — nnf_hardswish • torch

Applies the hardswish function, element-wise, as described in the paper Searching for MobileNetV3.

nnf_hardswish(input, inplace = FALSE)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions
inplace: can optionally do the operation in-place. Default: FALSE

Details

$$\mbox{Hardswish}(x) = \left\{
  \begin{array}{ll}
  0 & \mbox{if } x \le -3, \\
  x & \mbox{if } x \ge +3, \\
  x \cdot (x + 3)/6 & \mbox{otherwise}
  \end{array}
\right.$$

diff --git a/docs/reference/nnf_hardtanh.html b/docs/reference/nnf_hardtanh.html
deleted file mode 100644
Hardtanh — nnf_hardtanh • torch

Applies the HardTanh function element-wise.

nnf_hardtanh(input, min_val = -1, max_val = 1, inplace = FALSE)

nnf_hardtanh_(input, min_val = -1, max_val = 1)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions
min_val: minimum value of the linear region range. Default: -1
max_val: maximum value of the linear region range. Default: 1
inplace: can optionally do the operation in-place. Default: FALSE
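
A minimal sketch (not part of the original page): values are clamped to [min_val, max_val].

x <- torch_tensor(c(-2, -0.5, 0.5, 2))
nnf_hardtanh(x)                           # -1.0 -0.5  0.5  1.0
nnf_hardtanh(x, min_val = 0, max_val = 1) #  0.0  0.0  0.5  1.0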

diff --git a/docs/reference/nnf_hinge_embedding_loss.html b/docs/reference/nnf_hinge_embedding_loss.html
deleted file mode 100644
Hinge_embedding_loss — nnf_hinge_embedding_loss • torch

Measures the loss given an input tensor x and a labels tensor y (containing 1 or -1). This is usually used for measuring whether two inputs are similar or dissimilar, e.g. using the L1 pairwise distance as x, and is typically used for learning nonlinear embeddings or semi-supervised learning.

nnf_hinge_embedding_loss(input, target, margin = 1, reduction = "mean")

Arguments

input: tensor \((N, *)\) where * means any number of additional dimensions
target: tensor \((N, *)\), same shape as the input
margin: has a default value of 1
reduction: (string, optional) Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the sum of the output will be divided by the number of elements in the output; 'sum': the output will be summed. Default: 'mean'

diff --git a/docs/reference/nnf_instance_norm.html b/docs/reference/nnf_instance_norm.html
deleted file mode 100644
Instance_norm — nnf_instance_norm • torch

Applies Instance Normalization for each channel in each data sample in a batch.

nnf_instance_norm(
  input,
  running_mean = NULL,
  running_var = NULL,
  weight = NULL,
  bias = NULL,
  use_input_stats = TRUE,
  momentum = 0.1,
  eps = 1e-05
)

Arguments

input: the input tensor
running_mean: the running mean tensor
running_var: the running variance tensor
weight: the weight tensor
bias: the bias tensor
use_input_stats: whether to use input statistics
momentum: a double for the momentum
eps: an eps double for numerical stability

diff --git a/docs/reference/nnf_interpolate.html b/docs/reference/nnf_interpolate.html
deleted file mode 100644
Interpolate — nnf_interpolate • torch

Down/up samples the input to either the given size or the given scale_factor.

nnf_interpolate(
  input,
  size = NULL,
  scale_factor = NULL,
  mode = "nearest",
  align_corners = FALSE,
  recompute_scale_factor = NULL
)

Arguments

input: (Tensor) the input tensor
size: (int or Tuple[int] or Tuple[int, int] or Tuple[int, int, int]) output spatial size
scale_factor: (float or Tuple[float]) multiplier for spatial size. Has to match the input size if it is a tuple.
mode: (str) algorithm used for upsampling: 'nearest' | 'linear' | 'bilinear' | 'bicubic' | 'trilinear' | 'area'. Default: 'nearest'
align_corners: (bool, optional) Geometrically, we consider the pixels of the input and output as squares rather than points. If set to TRUE, the input and output tensors are aligned by the center points of their corner pixels, preserving the values at the corner pixels. If set to FALSE, the input and output tensors are aligned by the corner points of their corner pixels, and the interpolation uses edge value padding for out-of-boundary values, making this operation independent of input size when scale_factor is kept the same. This only has an effect when mode is 'linear', 'bilinear', 'bicubic' or 'trilinear'. Default: FALSE
recompute_scale_factor: (bool, optional) recompute the scale_factor for use in the interpolation calculation. When scale_factor is passed as a parameter, it is used to compute the output_size. If recompute_scale_factor is TRUE or not specified, a new scale_factor will be computed based on the output and input sizes for use in the interpolation computation (i.e. the computation will be identical to the one where the computed output_size were passed in explicitly). Otherwise, the passed-in scale_factor will be used in the interpolation computation. Note that when scale_factor is floating-point, the recomputed scale_factor may differ from the one passed in due to rounding and precision issues.

Details

The algorithm used for interpolation is determined by mode.

Currently temporal, spatial and volumetric sampling are supported, i.e. expected inputs are 3-D, 4-D or 5-D in shape.

The input dimensions are interpreted in the form: mini-batch x channels x [optional depth] x [optional height] x width.

The modes available for resizing are: nearest, linear (3-D only), bilinear, bicubic (4-D only), trilinear (5-D only), area.
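
A minimal sketch (not part of the original page): upsampling a 4-D (image-like) tensor by an explicit size and by a scale factor.

x <- torch_randn(1, 3, 8, 8)
nnf_interpolate(x, size = c(16, 16))$size() # 1 3 16 16, nearest-neighbour by default
nnf_interpolate(x, scale_factor = 2, mode = "bilinear", align_corners = FALSE)$size()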

diff --git a/docs/reference/nnf_kl_div.html b/docs/reference/nnf_kl_div.html
deleted file mode 100644
Kl_div — nnf_kl_div • torch

The Kullback-Leibler divergence loss.

nnf_kl_div(input, target, reduction = "mean")

Arguments

input: tensor \((N, *)\) where * means any number of additional dimensions
target: tensor \((N, *)\), same shape as the input
reduction: (string, optional) Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the sum of the output will be divided by the number of elements in the output; 'sum': the output will be summed. Default: 'mean'

diff --git a/docs/reference/nnf_l1_loss.html b/docs/reference/nnf_l1_loss.html
deleted file mode 100644
L1_loss — nnf_l1_loss • torch

Function that takes the mean element-wise absolute value difference.

nnf_l1_loss(input, target, reduction = "mean")

Arguments

input: tensor \((N, *)\) where * means any number of additional dimensions
target: tensor \((N, *)\), same shape as the input
reduction: (string, optional) Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the sum of the output will be divided by the number of elements in the output; 'sum': the output will be summed. Default: 'mean'

diff --git a/docs/reference/nnf_layer_norm.html b/docs/reference/nnf_layer_norm.html
deleted file mode 100644
Layer_norm — nnf_layer_norm • torch

Applies Layer Normalization for the last certain number of dimensions.

nnf_layer_norm(
  input,
  normalized_shape,
  weight = NULL,
  bias = NULL,
  eps = 1e-05
)

Arguments

input: the input tensor
normalized_shape: input shape from an expected input of that size. If a single integer is used, it is treated as a singleton list, and this module will normalize over the last dimension, which is expected to be of that specific size.
weight: the weight tensor
bias: the bias tensor
eps: a value added to the denominator for numerical stability. Default: 1e-5
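
A minimal sketch (not part of the original page): normalizing over the last dimension of a (batch, steps, features) tensor.

x <- torch_randn(2, 5, 10)
out <- nnf_layer_norm(x, normalized_shape = 10)
out$size() # 2 5 10; each length-10 feature vector now has mean ~0 and variance ~1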

diff --git a/docs/reference/nnf_leaky_relu.html b/docs/reference/nnf_leaky_relu.html
deleted file mode 100644
Leaky_relu — nnf_leaky_relu • torch

Applies element-wise
\(LeakyReLU(x) = max(0, x) + negative\_slope * min(0, x)\)

nnf_leaky_relu(input, negative_slope = 0.01, inplace = FALSE)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions
negative_slope: controls the angle of the negative slope. Default: 1e-2
inplace: can optionally do the operation in-place. Default: FALSE
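
A minimal sketch (not part of the original page): negative inputs are scaled by negative_slope instead of being zeroed.

x <- torch_tensor(c(-10, -1, 0, 1))
nnf_leaky_relu(x, negative_slope = 0.1) # -1.0 -0.1 0.0 1.0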

diff --git a/docs/reference/nnf_linear.html b/docs/reference/nnf_linear.html
deleted file mode 100644
Linear — nnf_linear • torch

Applies a linear transformation to the incoming data: \(y = xA^T + b\).

nnf_linear(input, weight, bias = NULL)

Arguments

input: \((N, *, in\_features)\) where * means any number of additional dimensions
weight: \((out\_features, in\_features)\) the weights tensor
bias: optional tensor \((out\_features)\)
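
A minimal sketch (not part of the original page), mapping 4 input features to 2 outputs:

x <- torch_randn(3, 4) # batch of 3, in_features = 4
w <- torch_randn(2, 4) # out_features = 2, in_features = 4
b <- torch_randn(2)
nnf_linear(x, w, b)    # a 3 x 2 tensor: x times t(w), plus the bias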

diff --git a/docs/reference/nnf_local_response_norm.html b/docs/reference/nnf_local_response_norm.html
deleted file mode 100644
Local_response_norm — nnf_local_response_norm • torch

Applies local response normalization over an input signal composed of several input planes, where channels occupy the second dimension. Applies normalization across channels.

nnf_local_response_norm(input, size, alpha = 1e-04, beta = 0.75, k = 1)

Arguments

input: the input tensor
size: amount of neighbouring channels used for normalization
alpha: multiplicative factor. Default: 0.0001
beta: exponent. Default: 0.75
k: additive factor. Default: 1

diff --git a/docs/reference/nnf_log_softmax.html b/docs/reference/nnf_log_softmax.html
deleted file mode 100644
Log_softmax — nnf_log_softmax • torch

Applies a softmax followed by a logarithm.

nnf_log_softmax(input, dim = NULL, dtype = NULL)

Arguments

input: (Tensor) input
dim: (int) a dimension along which log_softmax will be computed
dtype: (torch.dtype, optional) the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. This is useful for preventing data type overflows. Default: NULL

Details

While mathematically equivalent to log(softmax(x)), doing these two operations separately is slower and numerically unstable. This function uses an alternative formulation to compute the output and gradient correctly.
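
A minimal sketch (not part of the original page): each row of the result exponentiates back to a probability distribution.

x <- torch_randn(2, 3)
out <- nnf_log_softmax(x, dim = 2) # dims are 1-based in the R package
torch_exp(out)$sum(dim = 2)        # both row sums are 1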

diff --git a/docs/reference/nnf_logsigmoid.html b/docs/reference/nnf_logsigmoid.html
deleted file mode 100644
Logsigmoid — nnf_logsigmoid • torch

Applies element-wise \(LogSigmoid(x_i) = log(\frac{1}{1 + exp(-x_i)})\)

nnf_logsigmoid(input)

Arguments

input: \((N, *)\) tensor, where * means any number of additional dimensions

diff --git a/docs/reference/nnf_lp_pool1d.html b/docs/reference/nnf_lp_pool1d.html
deleted file mode 100644
Lp_pool1d — nnf_lp_pool1d • torch

Applies a 1D power-average pooling over an input signal composed of several input planes. If the sum of all inputs to the power of p is zero, the gradient is set to zero as well.

nnf_lp_pool1d(input, norm_type, kernel_size, stride = NULL, ceil_mode = FALSE)

Arguments

input: the input tensor
norm_type: if inf, one gets max pooling; if 0, one gets sum pooling (which is proportional to average pooling)
kernel_size: a single int, the size of the window
stride: a single int, the stride of the window. Default value is kernel_size
ceil_mode: when TRUE, will use ceil instead of floor to compute the output shape

diff --git a/docs/reference/nnf_lp_pool2d.html b/docs/reference/nnf_lp_pool2d.html
deleted file mode 100644
Lp_pool2d — nnf_lp_pool2d • torch

Applies a 2D power-average pooling over an input signal composed of several input planes. If the sum of all inputs to the power of p is zero, the gradient is set to zero as well.

nnf_lp_pool2d(input, norm_type, kernel_size, stride = NULL, ceil_mode = FALSE)

Arguments

input: the input tensor
norm_type: if inf, one gets max pooling; if 0, one gets sum pooling (which is proportional to average pooling)
kernel_size: a single int, the size of the window
stride: a single int, the stride of the window. Default value is kernel_size
ceil_mode: when TRUE, will use ceil instead of floor to compute the output shape

diff --git a/docs/reference/nnf_margin_ranking_loss.html b/docs/reference/nnf_margin_ranking_loss.html
deleted file mode 100644
Margin_ranking_loss — nnf_margin_ranking_loss • torch

Creates a criterion that measures the loss given inputs x1 and x2, two 1D mini-batch tensors, and a label 1D mini-batch tensor y (containing 1 or -1).

nnf_margin_ranking_loss(input1, input2, target, margin = 0, reduction = "mean")

Arguments

input1: the first input tensor
input2: the second input tensor
target: the target tensor
margin: has a default value of 0
reduction: (string, optional) Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the sum of the output will be divided by the number of elements in the output; 'sum': the output will be summed. Default: 'mean'
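
A minimal sketch (not part of the original page): y = 1 asks that input1 rank higher than input2, and y = -1 the reverse.

x1 <- torch_randn(3)
x2 <- torch_randn(3)
y  <- torch_tensor(c(1, -1, 1))
nnf_margin_ranking_loss(x1, x2, y, margin = 0.5)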

diff --git a/docs/reference/nnf_max_pool1d.html b/docs/reference/nnf_max_pool1d.html
deleted file mode 100644
Max_pool1d — nnf_max_pool1d • torch

Applies a 1D max pooling over an input signal composed of several input planes.

nnf_max_pool1d(
  input,
  kernel_size,
  stride = NULL,
  padding = 0,
  dilation = 1,
  ceil_mode = FALSE,
  return_indices = FALSE
)

Arguments

input: input tensor of shape (minibatch, in_channels, iW)
kernel_size: the size of the window. Can be a single number or a tuple (kW,).
stride: the stride of the window. Can be a single number or a tuple (sW,). Default: kernel_size
padding: implicit zero paddings on both sides of the input. Can be a single number or a tuple (padW,). Default: 0
dilation: controls the spacing between the kernel points; also known as the à trous algorithm
ceil_mode: when TRUE, will use ceil instead of floor to compute the output shape. Default: FALSE
return_indices: whether to return the indices where the max occurs

diff --git a/docs/reference/nnf_max_pool2d.html b/docs/reference/nnf_max_pool2d.html
deleted file mode 100644
Max_pool2d — nnf_max_pool2d • torch

Applies a 2D max pooling over an input signal composed of several input planes.

nnf_max_pool2d(
  input,
  kernel_size,
  stride = kernel_size,
  padding = 0,
  dilation = 1,
  ceil_mode = FALSE,
  return_indices = FALSE
)

Arguments

input: input tensor (minibatch, in_channels, iH, iW)
kernel_size: size of the pooling region. Can be a single number or a tuple (kH, kW)
stride: stride of the pooling operation. Can be a single number or a tuple (sH, sW). Default: kernel_size
padding: implicit zero paddings on both sides of the input. Can be a single number or a tuple (padH, padW). Default: 0
dilation: controls the spacing between the kernel points; also known as the à trous algorithm
ceil_mode: when TRUE, will use ceil instead of floor in the formula to compute the output shape. Default: FALSE
return_indices: whether to return the indices where the max occurs
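
A minimal sketch (not part of the original page): a 2 x 2 window halves each spatial dimension.

x <- torch_randn(1, 1, 4, 4)
nnf_max_pool2d(x, kernel_size = 2)$size() # 1 1 2 2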

diff --git a/docs/reference/nnf_max_pool3d.html b/docs/reference/nnf_max_pool3d.html
deleted file mode 100644
Max_pool3d — nnf_max_pool3d • torch

Applies a 3D max pooling over an input signal composed of several input planes.

nnf_max_pool3d(
  input,
  kernel_size,
  stride = NULL,
  padding = 0,
  dilation = 1,
  ceil_mode = FALSE,
  return_indices = FALSE
)

Arguments

input: input tensor (minibatch, in_channels, iT, iH, iW)
kernel_size: size of the pooling region. Can be a single number or a tuple (kT, kH, kW)
stride: stride of the pooling operation. Can be a single number or a tuple (sT, sH, sW). Default: kernel_size
padding: implicit zero paddings on both sides of the input. Can be a single number or a tuple (padT, padH, padW). Default: 0
dilation: controls the spacing between the kernel points; also known as the à trous algorithm
ceil_mode: when TRUE, will use ceil instead of floor in the formula to compute the output shape
return_indices: whether to return the indices where the max occurs

diff --git a/docs/reference/nnf_max_unpool1d.html b/docs/reference/nnf_max_unpool1d.html
deleted file mode 100644
Max_unpool1d — nnf_max_unpool1d • torch

Computes a partial inverse of MaxPool1d.

nnf_max_unpool1d(
  input,
  indices,
  kernel_size,
  stride = NULL,
  padding = 0,
  output_size = NULL
)

Arguments

input: the input tensor to invert
indices: the indices given out by max pool
kernel_size: size of the max pooling window
stride: stride of the max pooling window. It is set to kernel_size by default.
padding: padding that was added to the input
output_size: the targeted output size

diff --git a/docs/reference/nnf_max_unpool2d.html b/docs/reference/nnf_max_unpool2d.html
deleted file mode 100644
Max_unpool2d — nnf_max_unpool2d • torch

Computes a partial inverse of MaxPool2d.

nnf_max_unpool2d(
  input,
  indices,
  kernel_size,
  stride = NULL,
  padding = 0,
  output_size = NULL
)

Arguments

input: the input tensor to invert
indices: the indices given out by max pool
kernel_size: size of the max pooling window
stride: stride of the max pooling window. It is set to kernel_size by default.
padding: padding that was added to the input
output_size: the targeted output size
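
A minimal sketch (not part of the original page). It assumes nnf_max_pool2d() with return_indices = TRUE returns the pooled values together with their indices, which can then be fed back here:

x <- torch_randn(1, 1, 4, 4)
pooled <- nnf_max_pool2d(x, kernel_size = 2, return_indices = TRUE)
# pooled[[1]]: the max values, pooled[[2]]: their positions in the input
nnf_max_unpool2d(pooled[[1]], pooled[[2]], kernel_size = 2) # 1 x 1 x 4 x 4, zeros except at the maxima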

diff --git a/docs/reference/nnf_max_unpool3d.html b/docs/reference/nnf_max_unpool3d.html
deleted file mode 100644
Max_unpool3d — nnf_max_unpool3d • torch

Computes a partial inverse of MaxPool3d.

nnf_max_unpool3d(
  input,
  indices,
  kernel_size,
  stride = NULL,
  padding = 0,
  output_size = NULL
)

Arguments

input: the input tensor to invert
indices: the indices given out by max pool
kernel_size: size of the max pooling window
stride: stride of the max pooling window. It is set to kernel_size by default.
padding: padding that was added to the input
output_size: the targeted output size

diff --git a/docs/reference/nnf_mse_loss.html b/docs/reference/nnf_mse_loss.html
deleted file mode 100644
Mse_loss — nnf_mse_loss • torch

Measures the element-wise mean squared error.

nnf_mse_loss(input, target, reduction = "mean")

Arguments

input: tensor \((N, *)\) where * means any number of additional dimensions
target: tensor \((N, *)\), same shape as the input
reduction: (string, optional) Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the sum of the output will be divided by the number of elements in the output; 'sum': the output will be summed. Default: 'mean'
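
A minimal sketch (not part of the original page):

input <- torch_randn(3, 5)
target <- torch_randn(3, 5)
nnf_mse_loss(input, target)                     # a scalar, the mean of (input - target)^2
nnf_mse_loss(input, target, reduction = "none") # the 3 x 5 tensor of squared errors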

diff --git a/docs/reference/nnf_multi_head_attention_forward.html b/docs/reference/nnf_multi_head_attention_forward.html
deleted file mode 100644
Multi head attention forward — nnf_multi_head_attention_forward • torch

    Allows the model to jointly attend to information from different representation -subspaces. See reference: Attention Is All You Need

    -
    - -
    nnf_multi_head_attention_forward(
    -  query,
    -  key,
    -  value,
    -  embed_dim_to_check,
    -  num_heads,
    -  in_proj_weight,
    -  in_proj_bias,
    -  bias_k,
    -  bias_v,
    -  add_zero_attn,
    -  dropout_p,
    -  out_proj_weight,
    -  out_proj_bias,
    -  training = TRUE,
    -  key_padding_mask = NULL,
    -  need_weights = TRUE,
    -  attn_mask = NULL,
    -  use_separate_proj_weight = FALSE,
    -  q_proj_weight = NULL,
    -  k_proj_weight = NULL,
    -  v_proj_weight = NULL,
    -  static_k = NULL,
    -  static_v = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    query

    \((L, N, E)\) where L is the target sequence length, N is the batch size, E is -the embedding dimension.

    key

    \((S, N, E)\), where S is the source sequence length, N is the batch size, E is -the embedding dimension.

    value

    \((S, N, E)\) where S is the source sequence length, N is the batch size, E is -the embedding dimension.

    embed_dim_to_check

    total dimension of the model.

    num_heads

    parallel attention heads.

    in_proj_weight

    input projection weight and bias.

    in_proj_bias

    currently undocumented.

    bias_k

    bias of the key and value sequences to be added at dim=0.

    bias_v

    currently undocumented.

    add_zero_attn

    add a new batch of zeros to the key and -value sequences at dim=1.

    dropout_p

    probability of an element to be zeroed.

    out_proj_weight

    the output projection weight and bias.

    out_proj_bias

    currently undocumented.

    training

    apply dropout if is TRUE.

    key_padding_mask

\((N, S)\) where N is the batch size, S is the source sequence length. -If a ByteTensor is provided, the non-zero positions will be ignored while the -zero positions will be unchanged. If a BoolTensor is provided, the positions with the -value of TRUE will be ignored while the positions with the value of FALSE will be unchanged.

    need_weights

    output attn_output_weights.

    attn_mask

2D mask \((L, S)\) where L is the target sequence length, S is the source sequence length. -3D mask \((N*num_heads, L, S)\) where N is the batch size, L is the target sequence length, -S is the source sequence length. attn_mask ensures that position i is allowed to attend the unmasked -positions. If a ByteTensor is provided, the non-zero positions are not allowed to attend -while the zero positions will be unchanged. If a BoolTensor is provided, positions with TRUE -are not allowed to attend while FALSE values will be unchanged. If a FloatTensor -is provided, it will be added to the attention weight.

    use_separate_proj_weight

the function accepts the projection weights for -query, key, and value in different forms. If FALSE, in_proj_weight will be used, -which is a combination of q_proj_weight, k_proj_weight, and v_proj_weight.

    q_proj_weight

    input projection weight and bias.

    k_proj_weight

    currently undocumented.

    v_proj_weight

    currently undocumented.

    static_k

    static key and value used for attention operators.

    static_v

    currently undocumented.

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_multi_margin_loss.html b/docs/reference/nnf_multi_margin_loss.html deleted file mode 100644 index 05a13bbcb8ed891511a377b18ec79acd4048e878..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_multi_margin_loss.html +++ /dev/null @@ -1,240 +0,0 @@ - - - - - - - - -Multi_margin_loss — nnf_multi_margin_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Creates a criterion that optimizes a multi-class classification hinge loss -(margin-based loss) between input x (a 2D mini-batch Tensor) and output y -(which is a 1D tensor of target class indices, 0 <= y <= x$size(2) - 1).

    -
    - -
    nnf_multi_margin_loss(
    -  input,
    -  target,
    -  p = 1,
    -  margin = 1,
    -  weight = NULL,
    -  reduction = "mean"
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

tensor (N,*) where * means any number of additional dimensions

    target

tensor (N,*), same shape as the input

    p

Has a default value of 1; 1 and 2 are the only supported values.

    margin

    Has a default value of 1.

    weight

    a manual rescaling weight given to each class. If given, it has to -be a Tensor of size C. Otherwise, it is treated as if having all ones.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_multilabel_margin_loss.html b/docs/reference/nnf_multilabel_margin_loss.html deleted file mode 100644 index 096c4f8931a984e416bd0d73033688bd1c4019f2..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_multilabel_margin_loss.html +++ /dev/null @@ -1,220 +0,0 @@ - - - - - - - - -Multilabel_margin_loss — nnf_multilabel_margin_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Creates a criterion that optimizes a multi-class multi-classification hinge loss -(margin-based loss) between input x (a 2D mini-batch Tensor) and output y (which -is a 2D Tensor of target class indices).

    -
    - -
    nnf_multilabel_margin_loss(input, target, reduction = "mean")
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

tensor (N,*) where * means any number of additional dimensions

    target

tensor (N,*), same shape as the input

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_multilabel_soft_margin_loss.html b/docs/reference/nnf_multilabel_soft_margin_loss.html deleted file mode 100644 index 957fce8279bb45af031d48c255c10e9ff2fa2303..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_multilabel_soft_margin_loss.html +++ /dev/null @@ -1,222 +0,0 @@ - - - - - - - - -Multilabel_soft_margin_loss — nnf_multilabel_soft_margin_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Creates a criterion that optimizes a multi-label one-versus-all loss based on -max-entropy, between input x and target y of size (N, C).

    -
    - -
    nnf_multilabel_soft_margin_loss(input, target, weight, reduction = "mean")
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

tensor (N,*) where * means any number of additional dimensions

    target

tensor (N,*), same shape as the input

    weight

    weight tensor to apply on the loss.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_nll_loss.html b/docs/reference/nnf_nll_loss.html deleted file mode 100644 index 0391362ae77303677daf8531080939081894ebcb..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_nll_loss.html +++ /dev/null @@ -1,235 +0,0 @@ - - - - - - - - -Nll_loss — nnf_nll_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    The negative log likelihood loss.

    -
    - -
    nnf_nll_loss(
    -  input,
    -  target,
    -  weight = NULL,
    -  ignore_index = -100,
    -  reduction = "mean"
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    \((N, C)\) where C = number of classes or \((N, C, H, W)\) in -case of 2D Loss, or \((N, C, d_1, d_2, ..., d_K)\) where \(K \geq 1\) in -the case of K-dimensional loss.

    target

    \((N)\) where each value is \(0 \leq \mbox{targets}[i] \leq C-1\), -or \((N, d_1, d_2, ..., d_K)\) where \(K \geq 1\) for K-dimensional loss.

    weight

    (Tensor, optional) a manual rescaling weight given to each class. -If given, has to be a Tensor of size C

    ignore_index

    (int, optional) Specifies a target value that is ignored and -does not contribute to the input gradient.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
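A minimal sketch (an illustration, not from the original page; it assumes targets are given as 1-based class indices, following the R convention):
# \dontrun{
input <- nnf_log_softmax(torch_randn(3, 5), dim = 2)
target <- torch_tensor(c(1, 2, 5), dtype = torch_long())  # assumed 1-based classes
nnf_nll_loss(input, target)
# }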
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_normalize.html b/docs/reference/nnf_normalize.html deleted file mode 100644 index 4791597cfa4654cc55a10cb0c7c8496a6759fc4c..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_normalize.html +++ /dev/null @@ -1,230 +0,0 @@ - - - - - - - - -Normalize — nnf_normalize • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Performs \(L_p\) normalization of inputs over specified dimension.

    -
    - -
    nnf_normalize(input, p = 2, dim = 1, eps = 1e-12, out = NULL)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    input tensor of any shape

    p

    (float) the exponent value in the norm formulation. Default: 2

    dim

    (int) the dimension to reduce. Default: 1

    eps

    (float) small value to avoid division by zero. Default: 1e-12

    out

    (Tensor, optional) the output tensor. If out is used, this operation won't be differentiable.

    - -

    Details

    - -

    For a tensor input of sizes \((n_0, ..., n_{dim}, ..., n_k)\), each -\(n_{dim}\) -element vector \(v\) along dimension dim is transformed as

    -

    $$ - v = \frac{v}{\max(\Vert v \Vert_p, \epsilon)}. -$$

    -

    With the default arguments it uses the Euclidean norm over vectors along -dimension \(1\) for normalization.

    - -
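A minimal sketch illustrating row-wise normalization (added as an example, not from the original page):
# \dontrun{
x <- torch_randn(3, 4)
y <- nnf_normalize(x, p = 2, dim = 2)
torch_norm(y, dim = 2)  # each row should now have (approximately) unit L2 norm
# }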
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_one_hot.html b/docs/reference/nnf_one_hot.html deleted file mode 100644 index acf802adeadbd0fbdb49d5da1a4c47293cc9f9a8..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_one_hot.html +++ /dev/null @@ -1,220 +0,0 @@ - - - - - - - - -One_hot — nnf_one_hot • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Takes a LongTensor with index values of shape (*) and returns a tensor -of shape (*, num_classes) that has zeros everywhere except where the -index of the last dimension matches the corresponding value of the input tensor, -in which case it will be 1.

    -
    - -
    nnf_one_hot(tensor, num_classes = -1)
    - -

    Arguments

    - - - - - - - - - - -
    tensor

    (LongTensor) class values of any shape.

    num_classes

    (int) Total number of classes. If set to -1, the number -of classes will be inferred as one greater than the largest class value in -the input tensor.

    - -

    Details

    - -

    One-hot on Wikipedia: https://en.wikipedia.org/wiki/One-hot

    - -
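A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_tensor(c(1, 2, 3), dtype = torch_long())
nnf_one_hot(x, num_classes = 4)  # one row per element, with a single 1 per row
# }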
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_pad.html b/docs/reference/nnf_pad.html deleted file mode 100644 index 00b712046fc75e8a2067dc5d9e49a9649fd24ce5..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_pad.html +++ /dev/null @@ -1,248 +0,0 @@ - - - - - - - - -Pad — nnf_pad • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Pads tensor.

    -
    - -
    nnf_pad(input, pad, mode = "constant", value = 0)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) N-dimensional tensor

    pad

(tuple) m-element tuple, where \(\frac{m}{2} \leq\) input dimensions -and \(m\) is even.

    mode

    'constant', 'reflect', 'replicate' or 'circular'. Default: 'constant'

    value

    fill value for 'constant' padding. Default: 0.

    - -

    Padding size

    - - - - -

    The padding size by which to pad some dimensions of input -are described starting from the last dimension and moving forward. -\(\left\lfloor\frac{\mbox{len(pad)}}{2}\right\rfloor\) dimensions -of input will be padded. -For example, to pad only the last dimension of the input tensor, then -pad has the form -\((\mbox{padding\_left}, \mbox{padding\_right})\); -to pad the last 2 dimensions of the input tensor, then use -\((\mbox{padding\_left}, \mbox{padding\_right},\) -\(\mbox{padding\_top}, \mbox{padding\_bottom})\); -to pad the last 3 dimensions, use -\((\mbox{padding\_left}, \mbox{padding\_right},\) -\(\mbox{padding\_top}, \mbox{padding\_bottom}\) -\(\mbox{padding\_front}, \mbox{padding\_back})\).

    -

    Padding mode

    - - - - -

See nn_constant_pad_2d, nn_reflection_pad_2d, and -nn_replication_pad_2d for concrete examples on how each of the -padding modes works. Constant padding is implemented for arbitrary dimensions. -Replicate padding is implemented for padding the last 3 dimensions of a 5D input -tensor, the last 2 dimensions of a 4D input tensor, or the last dimension of a -3D input tensor. Reflect padding is only implemented for padding the last 2 -dimensions of a 4D input tensor, or the last dimension of a 3D input tensor.

    - -
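A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_ones(1, 1, 3, 3)
# pad the last dimension by (1, 1) and the second-to-last by (2, 2)
nnf_pad(x, pad = c(1, 1, 2, 2))
# }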
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_pairwise_distance.html b/docs/reference/nnf_pairwise_distance.html deleted file mode 100644 index 79bfbca797fc9ac5936b5eb6ae5c11f60b4f76ab..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_pairwise_distance.html +++ /dev/null @@ -1,222 +0,0 @@ - - - - - - - - -Pairwise_distance — nnf_pairwise_distance • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Computes the batchwise pairwise distance between vectors using the p-norm.

    -
    - -
    nnf_pairwise_distance(x1, x2, p = 2, eps = 1e-06, keepdim = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    x1

    (Tensor) First input.

    x2

    (Tensor) Second input (of size matching x1).

    p

    the norm degree. Default: 2

    eps

(float, optional) Small value to avoid division by zero. -Default: 1e-6

    keepdim

Determines whether or not to keep the vector dimension. Default: FALSE

    - - -
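A minimal sketch (an illustration, not from the original page):
# \dontrun{
x1 <- torch_randn(4, 8)
x2 <- torch_randn(4, 8)
nnf_pairwise_distance(x1, x2, p = 2)  # one distance per row pair
# }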
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_pdist.html b/docs/reference/nnf_pdist.html deleted file mode 100644 index fdb32b012bafb7a77499e135df94306b47df9d18..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_pdist.html +++ /dev/null @@ -1,220 +0,0 @@ - - - - - - - - -Pdist — nnf_pdist • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Computes the p-norm distance between every pair of row vectors in the input. -This is identical to the upper triangular portion, excluding the diagonal, of -torch_norm(input[:, None] - input, dim=2, p=p). This function will be faster -if the rows are contiguous.

    -
    - -
    nnf_pdist(input, p = 2)
    - -

    Arguments

    - - - - - - - - - - -
    input

    input tensor of shape \(N \times M\).

    p

    p value for the p-norm distance to calculate between each vector pair -\(\in [0, \infty]\).

    - -

    Details

    - -

    If input has shape \(N \times M\) then the output will have shape -\(\frac{1}{2} N (N - 1)\).

    - -
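A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_randn(5, 3)
nnf_pdist(x)  # expected length: 5 * (5 - 1) / 2 = 10
# }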
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_pixel_shuffle.html b/docs/reference/nnf_pixel_shuffle.html deleted file mode 100644 index e6f00ba1cf09c157349267535b4fb76070817b65..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_pixel_shuffle.html +++ /dev/null @@ -1,211 +0,0 @@ - - - - - - - - -Pixel_shuffle — nnf_pixel_shuffle • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Rearranges elements in a tensor of shape \((*, C \times r^2, H, W)\) to a -tensor of shape \((*, C, H \times r, W \times r)\).

    -
    - -
    nnf_pixel_shuffle(input, upscale_factor)
    - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor

    upscale_factor

    (int) factor to increase spatial resolution by

    - - -
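A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_randn(1, 9, 4, 4)
y <- nnf_pixel_shuffle(x, upscale_factor = 3)
dim(y)  # expected: 1, 1, 12, 12
# }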
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_poisson_nll_loss.html b/docs/reference/nnf_poisson_nll_loss.html deleted file mode 100644 index b440e9213d309f6f7f51c008a6b56f3ac036a2c1..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_poisson_nll_loss.html +++ /dev/null @@ -1,239 +0,0 @@ - - - - - - - - -Poisson_nll_loss — nnf_poisson_nll_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Poisson negative log likelihood loss.

    -
    - -
    nnf_poisson_nll_loss(
    -  input,
    -  target,
    -  log_input = TRUE,
    -  full = FALSE,
    -  eps = 1e-08,
    -  reduction = "mean"
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

tensor (N,*) where * means any number of additional dimensions

    target

tensor (N,*), same shape as the input

    log_input

    if TRUE the loss is computed as \(\exp(\mbox{input}) - \mbox{target} * \mbox{input}\), -if FALSE then loss is \(\mbox{input} - \mbox{target} * \log(\mbox{input}+\mbox{eps})\). -Default: TRUE.

    full

    whether to compute full loss, i. e. to add the Stirling approximation -term. Default: FALSE.

    eps

    (float, optional) Small value to avoid evaluation of \(\log(0)\) when -log_input=FALSE. Default: 1e-8

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
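A minimal sketch (an illustration, not from the original page; the targets here are hypothetical non-negative values):
# \dontrun{
input <- torch_randn(3, 4)       # interpreted as log-rates since log_input = TRUE
target <- torch_rand(3, 4) * 5   # hypothetical non-negative targets
nnf_poisson_nll_loss(input, target)
# }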
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_prelu.html b/docs/reference/nnf_prelu.html deleted file mode 100644 index 673939269665564be97854267b93f1f0efd276d1..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_prelu.html +++ /dev/null @@ -1,214 +0,0 @@ - - - - - - - - -Prelu — nnf_prelu • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies element-wise the function -\(PReLU(x) = max(0,x) + weight * min(0,x)\) -where weight is a learnable parameter.

    -
    - -
    nnf_prelu(input, weight)
    - -

    Arguments

    - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    weight

    (Tensor) the learnable weights

    - - -
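A minimal sketch (an illustration, not from the original page), using a single shared slope:
# \dontrun{
x <- torch_randn(2, 3)
w <- torch_tensor(0.25)  # one learnable slope shared across channels
nnf_prelu(x, w)
# }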
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_relu.html b/docs/reference/nnf_relu.html deleted file mode 100644 index aa6ed2c21baa0151d39d25ef9804f79df4ab270f..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_relu.html +++ /dev/null @@ -1,212 +0,0 @@ - - - - - - - - -Relu — nnf_relu • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the rectified linear unit function element-wise.

    -
    - -
    nnf_relu(input, inplace = FALSE)
    -
    -nnf_relu_(input)
    - -

    Arguments

    - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    inplace

    can optionally do the operation in-place. Default: FALSE

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_relu6.html b/docs/reference/nnf_relu6.html deleted file mode 100644 index 258d84665adc40d390dd85dfeb57e75ca89a46ab..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_relu6.html +++ /dev/null @@ -1,210 +0,0 @@ - - - - - - - - -Relu6 — nnf_relu6 • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the element-wise function \(ReLU6(x) = min(max(0,x), 6)\).

    -
    - -
    nnf_relu6(input, inplace = FALSE)
    - -

    Arguments

    - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    inplace

    can optionally do the operation in-place. Default: FALSE

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_rrelu.html b/docs/reference/nnf_rrelu.html deleted file mode 100644 index b635fbbb6b2cdcab81046f3269ec71291e8b566f..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_rrelu.html +++ /dev/null @@ -1,224 +0,0 @@ - - - - - - - - -Rrelu — nnf_rrelu • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Randomized leaky ReLU.

    -
    - -
    nnf_rrelu(input, lower = 1/8, upper = 1/3, training = FALSE, inplace = FALSE)
    -
    -nnf_rrelu_(input, lower = 1/8, upper = 1/3, training = FALSE)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    lower

    lower bound of the uniform distribution. Default: 1/8

    upper

    upper bound of the uniform distribution. Default: 1/3

    training

(bool) whether it's a training pass. Default: FALSE

    inplace

    can optionally do the operation in-place. Default: FALSE

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_selu.html b/docs/reference/nnf_selu.html deleted file mode 100644 index 09ca4fa96bfa86287310552d40971243b2f4df96..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_selu.html +++ /dev/null @@ -1,228 +0,0 @@ - - - - - - - - -Selu — nnf_selu • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies element-wise, -$$SELU(x) = scale * (max(0,x) + min(0, \alpha * (exp(x) - 1)))$$, -with \(\alpha=1.6732632423543772848170429916717\) and -\(scale=1.0507009873554804934193349852946\).

    -
    - -
    nnf_selu(input, inplace = FALSE)
    -
    -nnf_selu_(input)
    - -

    Arguments

    - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    inplace

    can optionally do the operation in-place. Default: FALSE

    - - -

    Examples

    -
    # \dontrun{ -x <- torch_randn(2, 2) -y <- nnf_selu(x) -nnf_selu_(x)
    #> torch_tensor -#> 0.3549 0.1844 -#> -1.4839 -0.8035 -#> [ CPUFloatType{2,2} ]
    #> [1] TRUE
    -# } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_smooth_l1_loss.html b/docs/reference/nnf_smooth_l1_loss.html deleted file mode 100644 index 8648514dcb3578e7693f36b35d6cfdd150feb234..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_smooth_l1_loss.html +++ /dev/null @@ -1,218 +0,0 @@ - - - - - - - - -Smooth_l1_loss — nnf_smooth_l1_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Function that uses a squared term if the absolute -element-wise error falls below 1 and an L1 term otherwise.

    -
    - -
    nnf_smooth_l1_loss(input, target, reduction = "mean")
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

tensor (N,*) where * means any number of additional dimensions

    target

tensor (N,*), same shape as the input

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
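A minimal sketch (an illustration, not from the original page):
# \dontrun{
input <- torch_randn(3, 5)
target <- torch_randn(3, 5)
nnf_smooth_l1_loss(input, target)
# }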
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_soft_margin_loss.html b/docs/reference/nnf_soft_margin_loss.html deleted file mode 100644 index 54ef73a66baecce6fad6e63ac697ee020c823740..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_soft_margin_loss.html +++ /dev/null @@ -1,218 +0,0 @@ - - - - - - - - -Soft_margin_loss — nnf_soft_margin_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Creates a criterion that optimizes a two-class classification logistic loss -between input tensor x and target tensor y (containing 1 or -1).

    -
    - -
    nnf_soft_margin_loss(input, target, reduction = "mean")
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

tensor (N,*) where * means any number of additional dimensions

    target

tensor (N,*), same shape as the input

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
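A minimal sketch (an illustration, not from the original page; torch_sign() is used here only to produce targets in {-1, 1}):
# \dontrun{
input <- torch_randn(3, 5)
target <- torch_sign(torch_randn(3, 5))  # entries in {-1, 1} (almost surely)
nnf_soft_margin_loss(input, target)
# }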
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_softmax.html b/docs/reference/nnf_softmax.html deleted file mode 100644 index 96192890ff7ee47af4b59a410a8a4b893adfc76e..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_softmax.html +++ /dev/null @@ -1,220 +0,0 @@ - - - - - - - - -Softmax — nnf_softmax • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a softmax function.

    -
    - -
    nnf_softmax(input, dim, dtype = NULL)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) input

    dim

    (int) A dimension along which softmax will be computed.

    dtype

(torch_dtype, optional) the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. This is useful for preventing data type overflows. -Default: NULL.

    - -

    Details

    - -

    Softmax is defined as:

    -

    $$Softmax(x_{i}) = exp(x_i)/\sum_j exp(x_j)$$

    -

    It is applied to all slices along dim, and will re-scale them so that the elements -lie in the range [0, 1] and sum to 1.

    - -
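A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_randn(2, 3)
p <- nnf_softmax(x, dim = 2)
p$sum(dim = 2)  # each row sums to 1
# }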
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_softmin.html b/docs/reference/nnf_softmin.html deleted file mode 100644 index df967c8ec49e2ef17eb3f73a3a53c4881956f7ac..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_softmin.html +++ /dev/null @@ -1,220 +0,0 @@ - - - - - - - - -Softmin — nnf_softmin • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies a softmin function.

    -
    - -
    nnf_softmin(input, dim, dtype = NULL)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) input

    dim

    (int) A dimension along which softmin will be computed -(so every slice along dim will sum to 1).

    dtype

(torch_dtype, optional) the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. -This is useful for preventing data type overflows. Default: NULL.

    - -

    Details

    - -

    Note that

    -

    $$Softmin(x) = Softmax(-x)$$.

    -

    See nnf_softmax definition for mathematical formula.

    - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_softplus.html b/docs/reference/nnf_softplus.html deleted file mode 100644 index 4a0100d1555952d189ad46e28c847d2344b32859..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_softplus.html +++ /dev/null @@ -1,218 +0,0 @@ - - - - - - - - -Softplus — nnf_softplus • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Applies element-wise the function \(Softplus(x) = 1/\beta * log(1 + exp(\beta * x))\).

    -
    - -
    nnf_softplus(input, beta = 1, threshold = 20)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    beta

    the beta value for the Softplus formulation. Default: 1

    threshold

    values above this revert to a linear function. Default: 20

    - -

    Details

    - -

    For numerical stability the implementation reverts to the linear function -when \(input * \beta > threshold\).

    - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_softshrink.html b/docs/reference/nnf_softshrink.html deleted file mode 100644 index 32230837cf56d5ec4dd23eb9da74b241f86c066f..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_softshrink.html +++ /dev/null @@ -1,211 +0,0 @@ - - - - - - - - -Softshrink — nnf_softshrink • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Applies the soft shrinkage function elementwise

    -
    - -
    nnf_softshrink(input, lambd = 0.5)
    - -

    Arguments

    - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    lambd

    the lambda (must be no less than zero) value for the Softshrink -formulation. Default: 0.5

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_softsign.html b/docs/reference/nnf_softsign.html deleted file mode 100644 index ce468c4af2ba06eb41359ffd1fcd7b68310fe475..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_softsign.html +++ /dev/null @@ -1,206 +0,0 @@ - - - - - - - - -Softsign — nnf_softsign • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Applies element-wise the function \(SoftSign(x) = x/(1 + |x|)\)

    -
    - -
    nnf_softsign(input)
    - -

    Arguments

    - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_tanhshrink.html b/docs/reference/nnf_tanhshrink.html deleted file mode 100644 index a95dbc3b6aed1a72bd5a0b1469b4d6d4158de4c8..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_tanhshrink.html +++ /dev/null @@ -1,206 +0,0 @@ - - - - - - - - -Tanhshrink — nnf_tanhshrink • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Applies element-wise the function \(Tanhshrink(x) = x - Tanh(x)\)

    -
    - -
    nnf_tanhshrink(input)
    - -

    Arguments

    - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_threshold.html b/docs/reference/nnf_threshold.html deleted file mode 100644 index 5d7c6e79044e89763cd453cbb1c613b44d5742f4..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_threshold.html +++ /dev/null @@ -1,220 +0,0 @@ - - - - - - - - -Threshold — nnf_threshold • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Thresholds each element of the input Tensor.

    -
    - -
    nnf_threshold(input, threshold, value, inplace = FALSE)
    -
    -nnf_threshold_(input, threshold, value)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

(N,*) tensor, where * means any number of additional -dimensions

    threshold

    The value to threshold at

    value

    The value to replace with

    inplace

    can optionally do the operation in-place. Default: FALSE

    - - -
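A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_randn(2, 3)
nnf_threshold(x, threshold = 0.5, value = 0)  # values <= 0.5 are replaced by 0
# }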
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_triplet_margin_loss.html b/docs/reference/nnf_triplet_margin_loss.html deleted file mode 100644 index ba202431a7ba32865158d471ac1b2ff51f333da9..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_triplet_margin_loss.html +++ /dev/null @@ -1,255 +0,0 @@ - - - - - - - - -Triplet_margin_loss — nnf_triplet_margin_loss • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Creates a criterion that measures the triplet loss given input tensors x1, -x2, and x3, and a margin with a value greater than 0. This is used for measuring -a relative similarity between samples. A triplet is composed of a, p, and n (i.e., -anchor, positive example, and negative example, respectively). The shapes of all -input tensors should be (N, D).

    -
    - -
    nnf_triplet_margin_loss(
    -  anchor,
    -  positive,
    -  negative,
    -  margin = 1,
    -  p = 2,
    -  eps = 1e-06,
    -  swap = FALSE,
    -  reduction = "mean"
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    anchor

    the anchor input tensor

    positive

    the positive input tensor

    negative

    the negative input tensor

    margin

    Default: 1.

    p

    The norm degree for pairwise distance. Default: 2.

    eps

    (float, optional) Small value to avoid division by zero.

    swap

    The distance swap is described in detail in the paper Learning shallow -convolutional feature descriptors with triplet losses by V. Balntas, E. Riba et al. -Default: FALSE.

    reduction

    (string, optional) – Specifies the reduction to apply to the -output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': -the sum of the output will be divided by the number of elements in the output, -'sum': the output will be summed. Default: 'mean'

    - - -
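A minimal sketch (an illustration, not from the original page):
# \dontrun{
anchor <- torch_randn(8, 16)
positive <- torch_randn(8, 16)
negative <- torch_randn(8, 16)
nnf_triplet_margin_loss(anchor, positive, negative, margin = 1)
# }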
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/nnf_unfold.html b/docs/reference/nnf_unfold.html deleted file mode 100644 index 662109dff7a71c52269913436ba0ee69a4b70af8..0000000000000000000000000000000000000000 --- a/docs/reference/nnf_unfold.html +++ /dev/null @@ -1,237 +0,0 @@ - - - - - - - - -Unfold — nnf_unfold • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Extracts sliding local blocks from a batched input tensor.

    -
    - -
    nnf_unfold(input, kernel_size, dilation = 1, padding = 0, stride = 1)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    the input tensor

    kernel_size

    the size of the sliding blocks

    dilation

    a parameter that controls the stride of elements within the -neighborhood. Default: 1

    padding

    implicit zero padding to be added on both sides of input. -Default: 0

    stride

    the stride of the sliding blocks in the input spatial dimensions. -Default: 1

    - -

    Warning

    - - - - -

    Currently, only 4-D input tensors (batched image-like tensors) are -supported.

    - - -

    More than one element of the unfolded tensor may refer to a single -memory location. As a result, in-place operations (especially ones that -are vectorized) may result in incorrect behavior. If you need to write -to the tensor, please clone it first.

    - -
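A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_randn(1, 3, 10, 12)
blocks <- nnf_unfold(x, kernel_size = c(4, 5))
dim(blocks)  # expected: 1, 3 * 4 * 5 = 60, number of sliding blocks
# }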
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/optim_adam.html b/docs/reference/optim_adam.html deleted file mode 100644 index 51b804c19c5658d8db423a7a948ae11f0d5b2927..0000000000000000000000000000000000000000 --- a/docs/reference/optim_adam.html +++ /dev/null @@ -1,239 +0,0 @@ - - - - - - - - -Implements Adam algorithm. — optim_adam • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    It has been proposed in Adam: A Method for Stochastic Optimization.

    -
    - -
    optim_adam(
    -  params,
    -  lr = 0.001,
    -  betas = c(0.9, 0.999),
    -  eps = 1e-08,
    -  weight_decay = 0,
    -  amsgrad = FALSE
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    params

    (iterable): iterable of parameters to optimize or dicts defining -parameter groups

    lr

    (float, optional): learning rate (default: 1e-3)

    betas

    (Tuple[float, float], optional): coefficients used for computing -running averages of gradient and its square (default: (0.9, 0.999))

    eps

    (float, optional): term added to the denominator to improve -numerical stability (default: 1e-8)

    weight_decay

    (float, optional): weight decay (L2 penalty) (default: 0)

    amsgrad

    (boolean, optional): whether to use the AMSGrad variant of this -algorithm from the paper On the Convergence of Adam and Beyond -(default: FALSE)

    - - -

    Examples

    -
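A minimal training-step sketch (an illustration, not from the original page; the nn_linear model and the random data are placeholders):
# \dontrun{
model <- nn_linear(10, 1)
opt <- optim_adam(model$parameters, lr = 0.01)
x <- torch_randn(32, 10)
y <- torch_randn(32, 1)
opt$zero_grad()
loss <- nnf_mse_loss(model(x), y)
loss$backward()
opt$step()
# }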
    
    -  
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/optim_required.html b/docs/reference/optim_required.html deleted file mode 100644 index 8316f053ec54ff9396626f39ff4367a3919da9dc..0000000000000000000000000000000000000000 --- a/docs/reference/optim_required.html +++ /dev/null @@ -1,197 +0,0 @@ - - - - - - - - -Dummy value indicating a required value. — optim_required • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

A dummy value used by optimizers to indicate that an argument is required and has no default.

    -
    - -
    optim_required()
    - - - -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/optim_sgd.html b/docs/reference/optim_sgd.html deleted file mode 100644 index fef15d1e7203d55bad049cee715e188db59c9305..0000000000000000000000000000000000000000 --- a/docs/reference/optim_sgd.html +++ /dev/null @@ -1,264 +0,0 @@ - - - - - - - - -SGD optimizer — optim_sgd • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Implements stochastic gradient descent (optionally with momentum). -Nesterov momentum is based on the formula from -On the importance of initialization and momentum in deep learning.

    -
    - -
    optim_sgd(
    -  params,
    -  lr = optim_required(),
    -  momentum = 0,
    -  dampening = 0,
    -  weight_decay = 0,
    -  nesterov = FALSE
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    params

    (iterable): iterable of parameters to optimize or dicts defining -parameter groups

    lr

    (float): learning rate

    momentum

    (float, optional): momentum factor (default: 0)

    dampening

    (float, optional): dampening for momentum (default: 0)

    weight_decay

    (float, optional): weight decay (L2 penalty) (default: 0)

    nesterov

    (bool, optional): enables Nesterov momentum (default: FALSE)

    - -

    Note

    - - - - -

    The implementation of SGD with Momentum-Nesterov subtly differs from -Sutskever et. al. and implementations in some other frameworks.

    -

    Considering the specific case of Momentum, the update can be written as -$$ - \begin{array}{ll} -v_{t+1} & = \mu * v_{t} + g_{t+1}, \\ -p_{t+1} & = p_{t} - \mbox{lr} * v_{t+1}, -\end{array} -$$

    -

    where \(p\), \(g\), \(v\) and \(\mu\) denote the -parameters, gradient, velocity, and momentum respectively.

    -

    This is in contrast to Sutskever et. al. and -other frameworks which employ an update of the form

    -

    $$ - \begin{array}{ll} -v_{t+1} & = \mu * v_{t} + \mbox{lr} * g_{t+1}, \\ -p_{t+1} & = p_{t} - v_{t+1}. -\end{array} -$$ -The Nesterov version is analogously modified.

    - -

    Examples

    -
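A minimal training-step sketch (an illustration, not from the original page; the model and data are placeholders):
# \dontrun{
model <- nn_linear(10, 1)
opt <- optim_sgd(model$parameters, lr = 0.1, momentum = 0.9)
opt$zero_grad()
loss <- nnf_mse_loss(model(torch_randn(32, 10)), torch_randn(32, 1))
loss$backward()
opt$step()
# }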
    
    -  
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/tensor_dataset.html b/docs/reference/tensor_dataset.html deleted file mode 100644 index 4d4eaa0810a4221eaa30221b4481c130a3f3ef52..0000000000000000000000000000000000000000 --- a/docs/reference/tensor_dataset.html +++ /dev/null @@ -1,205 +0,0 @@ - - - - - - - - -Dataset wrapping tensors. — tensor_dataset • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Each sample will be retrieved by indexing tensors along the first dimension.

    -
    - -
    tensor_dataset(...)
    - -

    Arguments

    - - - - - - -
    ...

tensors that have the same size in the first dimension.

    - - -
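A minimal sketch (an illustration, not from the original page; it assumes datasets support length() and [ indexing):
# \dontrun{
x <- torch_randn(100, 10)
y <- torch_randn(100)
ds <- tensor_dataset(x, y)
length(ds)  # 100
ds[1]       # assumed: the first row of each tensor, returned as a list
# }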
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_abs.html b/docs/reference/torch_abs.html deleted file mode 100644 index 86289261f4190b876a72754ba5d00032bf94715a..0000000000000000000000000000000000000000 --- a/docs/reference/torch_abs.html +++ /dev/null @@ -1,226 +0,0 @@ - - - - - - - - -Abs — torch_abs • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Abs

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    abs(input, out=None) -> Tensor

    - - - - -

    Computes the element-wise absolute value of the given input tensor.

    -

    $$ - \mbox{out}_{i} = |\mbox{input}_{i}| -$$

    - -

    Examples

    -
    # \dontrun{ - -torch_abs(torch_tensor(c(-1, -2, 3)))
    #> torch_tensor -#> 1 -#> 2 -#> 3 -#> [ CPUFloatType{3} ]
    # } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_acos.html b/docs/reference/torch_acos.html deleted file mode 100644 index 7f0a038344c6ab0e3d83c0b8198f855b86bd0f51..0000000000000000000000000000000000000000 --- a/docs/reference/torch_acos.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Acos — torch_acos • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Acos

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    acos(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with the arccosine of the elements of input.

    -

    $$ - \mbox{out}_{i} = \cos^{-1}(\mbox{input}_{i}) -$$

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> -0.6137 -#> 1.1598 -#> 0.0958 -#> -0.2733 -#> [ CPUFloatType{4} ]
    torch_acos(a)
    #> torch_tensor -#> 2.2315 -#> nan -#> 1.4748 -#> 1.8476 -#> [ CPUFloatType{4} ]
    # } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_adaptive_avg_pool1d.html b/docs/reference/torch_adaptive_avg_pool1d.html deleted file mode 100644 index c46cb1abac3f804286e8fd5c4c108e94d1674eb5..0000000000000000000000000000000000000000 --- a/docs/reference/torch_adaptive_avg_pool1d.html +++ /dev/null @@ -1,212 +0,0 @@ - - - - - - - - -Adaptive_avg_pool1d — torch_adaptive_avg_pool1d • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Adaptive_avg_pool1d

    -
    - - -

    Arguments

    - - - - - - -
    output_size

the target output size (single integer)

    - -

    adaptive_avg_pool1d(input, output_size) -> Tensor

    - - - - -

    Applies a 1D adaptive average pooling over an input signal composed of -several input planes.

    -

See nn_adaptive_avg_pool1d() for details and output shape.

    - -
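A minimal sketch (an illustration, not from the original page):
# \dontrun{
x <- torch_randn(1, 4, 16)
y <- torch_adaptive_avg_pool1d(x, output_size = 8)
dim(y)  # expected: 1, 4, 8
# }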
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_add.html b/docs/reference/torch_add.html deleted file mode 100644 index 57bab752ad47d494e64b9b3dc7bc2bdbc90802d1..0000000000000000000000000000000000000000 --- a/docs/reference/torch_add.html +++ /dev/null @@ -1,278 +0,0 @@ - - - - - - - - -Add — torch_add • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Add

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    value

    (Number) the number to be added to each element of input

    other

    (Tensor) the second input tensor

    alpha

    (Number) the scalar multiplier for other

    - -

    add(input, other, out=None)

    - - - - -

Adds the scalar other to each element of the input tensor -and returns a new resulting tensor.

    -

    $$ - \mbox{out} = \mbox{input} + \mbox{other} -$$ -If input is of type FloatTensor or DoubleTensor, other must be -a real number, otherwise it should be an integer.

    -

    add(input, other, *, alpha=1, out=None)

    - - - - -

    Each element of the tensor other is multiplied by the scalar -alpha and added to each element of the tensor input. -The resulting tensor is returned.

    -

    The shapes of input and other must be -broadcastable .

    -

    $$ - \mbox{out} = \mbox{input} + \mbox{alpha} \times \mbox{other} -$$ -If other is of type FloatTensor or DoubleTensor, alpha must be -a real number, otherwise it should be an integer.

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> 0.2160 -#> 0.1973 -#> -0.1795 -#> -0.9024 -#> [ CPUFloatType{4} ]
    torch_add(a, 20)
    #> torch_tensor -#> 20.2160 -#> 20.1973 -#> 19.8204 -#> 19.0976 -#> [ CPUFloatType{4} ]
    - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> -1.6998 -#> -0.1848 -#> -0.4348 -#> -0.7475 -#> [ CPUFloatType{4} ]
    b = torch_randn(c(4, 1)) -b
    #> torch_tensor -#> 0.9213 -#> 0.5193 -#> 0.3855 -#> -1.5317 -#> [ CPUFloatType{4,1} ]
    torch_add(a, b)
    #> torch_tensor -#> -0.7785 0.7364 0.4865 0.1738 -#> -1.1805 0.3345 0.0845 -0.2282 -#> -1.3142 0.2007 -0.0492 -0.3619 -#> -3.2315 -1.7166 -1.9665 -2.2792 -#> [ CPUFloatType{4,4} ]
    # } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_addbmm.html b/docs/reference/torch_addbmm.html deleted file mode 100644 index 3ddde568f0883d9f4274e86ddc997e76dc7b6b13..0000000000000000000000000000000000000000 --- a/docs/reference/torch_addbmm.html +++ /dev/null @@ -1,257 +0,0 @@ - - - - - - - - -Addbmm — torch_addbmm • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Addbmm

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    batch1

    (Tensor) the first batch of matrices to be multiplied

    batch2

    (Tensor) the second batch of matrices to be multiplied

    beta

    (Number, optional) multiplier for input (\(\beta\))

    input

    (Tensor) matrix to be added

    alpha

    (Number, optional) multiplier for batch1 @ batch2 (\(\alpha\))

    out

    (Tensor, optional) the output tensor.

    - -

    addbmm(input, batch1, batch2, *, beta=1, alpha=1, out=None) -> Tensor

    - - - - -

    Performs a batch matrix-matrix product of matrices stored -in batch1 and batch2, -with a reduced add step (all matrix multiplications get accumulated -along the first dimension). -input is added to the final result.

    -

    batch1 and batch2 must be 3-D tensors each containing the -same number of matrices.

    -

    If batch1 is a \((b \times n \times m)\) tensor, batch2 is a -\((b \times m \times p)\) tensor, input must be -broadcastable with a \((n \times p)\) tensor -and out will be a \((n \times p)\) tensor.

    -

    $$ - out = \beta\ \mbox{input} + \alpha\ (\sum_{i=0}^{b-1} \mbox{batch1}_i \mathbin{@} \mbox{batch2}_i) -$$ -For inputs of type FloatTensor or DoubleTensor, arguments beta and alpha -must be real numbers, otherwise they should be integers.

    - -

    Examples

    -
    # \dontrun{ - -M = torch_randn(c(3, 5)) -batch1 = torch_randn(c(10, 3, 4)) -batch2 = torch_randn(c(10, 4, 5)) -torch_addbmm(M, batch1, batch2)
    #> torch_tensor -#> 5.7025 7.7808 5.3946 0.1290 2.4487 -#> -1.9730 -3.0379 -5.4090 0.6009 -3.1469 -#> 4.6785 6.4997 -11.4732 6.3957 10.2272 -#> [ CPUFloatType{3,5} ]
    # } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_addcdiv.html b/docs/reference/torch_addcdiv.html deleted file mode 100644 index 4f76063f7073d59f8baad80f60d7aa9336a170b1..0000000000000000000000000000000000000000 --- a/docs/reference/torch_addcdiv.html +++ /dev/null @@ -1,260 +0,0 @@ - - - - - - - - -Addcdiv — torch_addcdiv • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Addcdiv

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the tensor to be added

    tensor1

    (Tensor) the numerator tensor

    tensor2

    (Tensor) the denominator tensor

    value

    (Number, optional) multiplier for \(\mbox{tensor1} / \mbox{tensor2}\)

    out

    (Tensor, optional) the output tensor.

    - -

    addcdiv(input, tensor1, tensor2, *, value=1, out=None) -> Tensor

    - - - - -

Performs the element-wise division of tensor1 by tensor2, -multiplies the result by the scalar value, and adds it to input.

    -

    Warning

    - - - -

Integer division with addcdiv is deprecated, and in a future release -addcdiv will perform a true division of tensor1 and tensor2. -The current addcdiv behavior can be replicated using torch_floor_divide() -for integral inputs -(input + value * tensor1 // tensor2) -and torch_div() for float inputs -(input + value * tensor1 / tensor2). -The new addcdiv behavior can be implemented with torch_true_divide() -(input + value * torch_true_divide(tensor1, tensor2)).

    -

    $$ - \mbox{out}_i = \mbox{input}_i + \mbox{value} \times \frac{\mbox{tensor1}_i}{\mbox{tensor2}_i} -$$

    -

    The shapes of input, tensor1, and tensor2 must be -broadcastable .

    -

    For inputs of type FloatTensor or DoubleTensor, value must be -a real number, otherwise an integer.

    - -

    Examples

    -
    # \dontrun{ - -t = torch_randn(c(1, 3)) -t1 = torch_randn(c(3, 1)) -t2 = torch_randn(c(1, 3)) -torch_addcdiv(t, t1, t2, 0.1)
    #> torch_tensor -#> -0.3050 -0.1424 0.5617 -#> -0.3424 0.0045 1.0519 -#> -0.2932 -0.1885 0.4079 -#> [ CPUFloatType{3,3} ]
    # } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_addcmul.html b/docs/reference/torch_addcmul.html deleted file mode 100644 index d3d5f082ff53595211965b18a23178f25993b059..0000000000000000000000000000000000000000 --- a/docs/reference/torch_addcmul.html +++ /dev/null @@ -1,247 +0,0 @@ - - - - - - - - -Addcmul — torch_addcmul • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Addcmul

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the tensor to be added

    tensor1

    (Tensor) the tensor to be multiplied

    tensor2

    (Tensor) the tensor to be multiplied

    value

    (Number, optional) multiplier for \(tensor1 .* tensor2\)

    out

    (Tensor, optional) the output tensor.

    - -

    addcmul(input, tensor1, tensor2, *, value=1, out=None) -> Tensor

    - - - - -

Performs the element-wise multiplication of tensor1 -by tensor2, multiplies the result by the scalar value, -and adds it to input.

    -

$$ - \mbox{out}_i = \mbox{input}_i + \mbox{value} \times \mbox{tensor1}_i \times \mbox{tensor2}_i -$$ -The shapes of input, tensor1, and tensor2 must be -broadcastable.

    -

    For inputs of type FloatTensor or DoubleTensor, value must be -a real number, otherwise an integer.

    - -

    Examples

    -
    # \dontrun{ - -t = torch_randn(c(1, 3)) -t1 = torch_randn(c(3, 1)) -t2 = torch_randn(c(1, 3)) -torch_addcmul(t, t1, t2, 0.1)
    #> torch_tensor -#> -0.2008 -0.8560 0.9351 -#> -0.1664 -0.8433 0.9042 -#> -0.5670 -0.9908 1.2630 -#> [ CPUFloatType{3,3} ]
    # } -
    -
    - -
    - - -
    - - -
    -

    Site built with pkgdown 1.5.1.

    -
    - -
    -
    - - - - - - - - diff --git a/docs/reference/torch_addmm.html b/docs/reference/torch_addmm.html deleted file mode 100644 index 5bafc455fd4e403ad0549072acc2a3373092edf2..0000000000000000000000000000000000000000 --- a/docs/reference/torch_addmm.html +++ /dev/null @@ -1,253 +0,0 @@ - - - - - - - - -Addmm — torch_addmm • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Addmm

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) matrix to be added

    mat1

    (Tensor) the first matrix to be multiplied

    mat2

    (Tensor) the second matrix to be multiplied

    beta

    (Number, optional) multiplier for input (\(\beta\))

    alpha

    (Number, optional) multiplier for \(mat1 @ mat2\) (\(\alpha\))

    out

    (Tensor, optional) the output tensor.

    - -

    addmm(input, mat1, mat2, *, beta=1, alpha=1, out=None) -> Tensor

Performs a matrix multiplication of the matrices mat1 and mat2. The matrix input is added to the final result.

If mat1 is a \((n \times m)\) tensor and mat2 is a \((m \times p)\) tensor, then input must be broadcastable with a \((n \times p)\) tensor and out will be a \((n \times p)\) tensor.

alpha and beta are scaling factors on the matrix-matrix product between mat1 and mat2 and the added matrix input, respectively.

$$
\mbox{out} = \beta\ \mbox{input} + \alpha\ (\mbox{mat1} \mathbin{@} \mbox{mat2})
$$

For inputs of type FloatTensor or DoubleTensor, arguments beta and alpha must be real numbers, otherwise they should be integers.

    Examples

# \dontrun{
M = torch_randn(c(2, 3))
mat1 = torch_randn(c(2, 3))
mat2 = torch_randn(c(3, 3))
torch_addmm(M, mat1, mat2)
#> torch_tensor
#> -1.4411 0.9520 5.5685
#> 2.0314 0.6255 2.2542
#> [ CPUFloatType{2,3} ]
# }

diff --git a/docs/reference/torch_addmv.html b/docs/reference/torch_addmv.html
deleted file mode 100644
index 27c8c5c2aacca511687422fcba91aa59923d3499..0000000000000000000000000000000000000000
--- a/docs/reference/torch_addmv.html
+++ /dev/null
@@ -1,254 +0,0 @@

    Addmv

Arguments

input: (Tensor) vector to be added
mat: (Tensor) matrix to be multiplied
vec: (Tensor) vector to be multiplied
beta: (Number, optional) multiplier for input (\(\beta\))
alpha: (Number, optional) multiplier for \(\mbox{mat} \mathbin{@} \mbox{vec}\) (\(\alpha\))
out: (Tensor, optional) the output tensor.

    addmv(input, mat, vec, *, beta=1, alpha=1, out=None) -> Tensor

Performs a matrix-vector product of the matrix mat and the vector vec. The vector input is added to the final result.

If mat is a \((n \times m)\) tensor and vec is a 1-D tensor of size m, then input must be broadcastable with a 1-D tensor of size n, and out will be a 1-D tensor of size n.

alpha and beta are scaling factors on the matrix-vector product between mat and vec and the added tensor input, respectively.

$$
\mbox{out} = \beta\ \mbox{input} + \alpha\ (\mbox{mat} \mathbin{@} \mbox{vec})
$$

For inputs of type FloatTensor or DoubleTensor, arguments beta and alpha must be real numbers, otherwise they should be integers.

    Examples

# \dontrun{
M = torch_randn(c(2))
mat = torch_randn(c(2, 3))
vec = torch_randn(c(3))
torch_addmv(M, mat, vec)
#> torch_tensor
#> 1.9265
#> 1.5524
#> [ CPUFloatType{2} ]
# }

diff --git a/docs/reference/torch_addr.html b/docs/reference/torch_addr.html
deleted file mode 100644
index 6ac26d9629d10efd6281759ead6d9732dd5b4dd8..0000000000000000000000000000000000000000
--- a/docs/reference/torch_addr.html
+++ /dev/null
@@ -1,256 +0,0 @@

    Addr

Arguments

input: (Tensor) matrix to be added
vec1: (Tensor) the first vector of the outer product
vec2: (Tensor) the second vector of the outer product
beta: (Number, optional) multiplier for input (\(\beta\))
alpha: (Number, optional) multiplier for \(\mbox{vec1} \otimes \mbox{vec2}\) (\(\alpha\))
out: (Tensor, optional) the output tensor.

    addr(input, vec1, vec2, *, beta=1, alpha=1, out=None) -> Tensor

Performs the outer-product of vectors vec1 and vec2 and adds it to the matrix input.

Optional values beta and alpha are scaling factors on the outer product between vec1 and vec2 and the added matrix input, respectively.

$$
\mbox{out} = \beta\ \mbox{input} + \alpha\ (\mbox{vec1} \otimes \mbox{vec2})
$$

If vec1 is a vector of size n and vec2 is a vector of size m, then input must be broadcastable with a matrix of size \((n \times m)\) and out will be a matrix of size \((n \times m)\).

For inputs of type FloatTensor or DoubleTensor, arguments beta and alpha must be real numbers, otherwise they should be integers.

    Examples

# \dontrun{
vec1 = torch_arange(1., 4.)
vec2 = torch_arange(1., 3.)
M = torch_zeros(c(3, 2))
torch_addr(M, vec1, vec2)
#> torch_tensor
#> 1 2
#> 2 4
#> 3 6
#> [ CPUFloatType{3,2} ]
# }

diff --git a/docs/reference/torch_allclose.html b/docs/reference/torch_allclose.html
deleted file mode 100644
index 975ed4f9a8db58897037b0ec1368f2d7be487249..0000000000000000000000000000000000000000
--- a/docs/reference/torch_allclose.html
+++ /dev/null
@@ -1,236 +0,0 @@

    Allclose

Arguments

input: (Tensor) first tensor to compare
other: (Tensor) second tensor to compare
atol: (float, optional) absolute tolerance. Default: 1e-08
rtol: (float, optional) relative tolerance. Default: 1e-05
equal_nan: (bool, optional) if True, then two NaNs will be compared as equal. Default: False

    allclose(input, other, rtol=1e-05, atol=1e-08, equal_nan=False) -> bool

This function checks if all input and other satisfy the condition:

$$
\vert \mbox{input} - \mbox{other} \vert \leq \mbox{atol} + \mbox{rtol} \times \vert \mbox{other} \vert
$$

elementwise, for all elements of input and other. The behaviour of this function is analogous to numpy.allclose (https://docs.scipy.org/doc/numpy/reference/generated/numpy.allclose.html).

    Examples

# \dontrun{
torch_allclose(torch_tensor(c(10000., 1e-07)), torch_tensor(c(10000.1, 1e-08)))
#> [1] FALSE
torch_allclose(torch_tensor(c(10000., 1e-08)), torch_tensor(c(10000.1, 1e-09)))
#> [1] FALSE
torch_allclose(torch_tensor(c(1.0, NaN)), torch_tensor(c(1.0, NaN)))
#> [1] FALSE
torch_allclose(torch_tensor(c(1.0, NaN)), torch_tensor(c(1.0, NaN)), equal_nan = TRUE)
#> [1] TRUE
# }

diff --git a/docs/reference/torch_angle.html b/docs/reference/torch_angle.html
deleted file mode 100644
index e0c61819ae0e2b4ad48c84e10754a96892ee6fd0..0000000000000000000000000000000000000000
--- a/docs/reference/torch_angle.html
+++ /dev/null
@@ -1,219 +0,0 @@

    Angle

Arguments

input: (Tensor) the input tensor.
out: (Tensor, optional) the output tensor.

    angle(input, out=None) -> Tensor

Computes the element-wise angle (in radians) of the given input tensor.

$$
\mbox{out}_{i} = \mbox{angle}(\mbox{input}_{i})
$$

    Examples
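
A minimal sketch, not from the source page; it assumes a build with complex tensor support, so treat torch_complex() here as hypothetical:

# \dontrun{
# torch_complex() is assumed here; complex dtypes may be unavailable in older builds
z = torch_complex(torch_tensor(c(-1, 0, 1)), torch_tensor(c(0, 1, 1)))
torch_angle(z)  # expected angles in radians: pi, pi/2, pi/4
# }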


diff --git a/docs/reference/torch_arange.html b/docs/reference/torch_arange.html
deleted file mode 100644
index 743159b5db8453970000f8c27354d98273502ab9..0000000000000000000000000000000000000000
--- a/docs/reference/torch_arange.html
+++ /dev/null
@@ -1,265 +0,0 @@

    Arange

Arguments

start: (Number) the starting value for the set of points. Default: 0.
end: (Number) the ending value for the set of points
step: (Number) the gap between each pair of adjacent points. Default: 1.
out: (Tensor, optional) the output tensor.
dtype: (torch.dtype, optional) the desired data type of the returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type). If dtype is not given, infer the data type from the other input arguments. If any of start, end, or step are floating-point, the dtype is inferred to be the default dtype; see torch.get_default_dtype. Otherwise, the dtype is inferred to be torch.int64.
layout: (torch.layout, optional) the desired layout of the returned Tensor. Default: torch_strided.
device: (torch.device, optional) the desired device of the returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.
requires_grad: (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    arange(start=0, end, step=1, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Returns a 1-D tensor of size \(\left\lceil \frac{\mbox{end} - \mbox{start}}{\mbox{step}} \right\rceil\) with values from the interval [start, end) taken with common difference step beginning from start.

Note that non-integer step is subject to floating point rounding errors when comparing against end; to avoid inconsistency, we advise adding a small epsilon to end in such cases.

$$
\mbox{out}_{i+1} = \mbox{out}_{i} + \mbox{step}
$$

    Examples

# \dontrun{
torch_arange(start = 0, end = 5)
#> torch_tensor
#> 0
#> 1
#> 2
#> 3
#> 4
#> [ CPUFloatType{5} ]
torch_arange(1, 4)
#> torch_tensor
#> 1
#> 2
#> 3
#> [ CPUFloatType{3} ]
torch_arange(1, 2.5, 0.5)
#> torch_tensor
#> 1.0000
#> 1.5000
#> 2.0000
#> [ CPUFloatType{3} ]
# }

diff --git a/docs/reference/torch_argmax.html b/docs/reference/torch_argmax.html
deleted file mode 100644
index 6e99af2f94ea02892c9aa9916017782d4d9912b0..0000000000000000000000000000000000000000
--- a/docs/reference/torch_argmax.html
+++ /dev/null
@@ -1,230 +0,0 @@

    Argmax

Arguments

input: (Tensor) the input tensor.
dim: (int) the dimension to reduce. If None, the argmax of the flattened input is returned.
keepdim: (bool) whether the output tensor has dim retained or not. Ignored if dim=None.

    argmax(input) -> LongTensor

Returns the indices of the maximum value of all elements in the input tensor.

This is the second value returned by torch_max. See its documentation for the exact semantics of this method.

argmax(input, dim, keepdim=False) -> LongTensor

Returns the indices of the maximum values of a tensor across a dimension.

This is the second value returned by torch_max. See its documentation for the exact semantics of this method.

    Examples
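
A minimal sketch, not from the source page; it mirrors the torch_argmin example and omits output since the input is random:

# \dontrun{
a = torch_randn(c(4, 4))
torch_argmax(a)          # index of the maximum over the flattened tensor
torch_argmax(a, dim = 1) # indices of the maxima along the first dimension
# }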


diff --git a/docs/reference/torch_argmin.html b/docs/reference/torch_argmin.html
deleted file mode 100644
index 8a096e32c7bdcacca2dc593b3726f5c96d78422f..0000000000000000000000000000000000000000
--- a/docs/reference/torch_argmin.html
+++ /dev/null
@@ -1,254 +0,0 @@

    Argmin

Arguments

input: (Tensor) the input tensor.
dim: (int) the dimension to reduce. If None, the argmin of the flattened input is returned.
keepdim: (bool) whether the output tensor has dim retained or not. Ignored if dim=None.

    argmin(input) -> LongTensor

Returns the indices of the minimum value of all elements in the input tensor.

This is the second value returned by torch_min. See its documentation for the exact semantics of this method.

argmin(input, dim, keepdim=False, out=None) -> LongTensor

Returns the indices of the minimum values of a tensor across a dimension.

This is the second value returned by torch_min. See its documentation for the exact semantics of this method.

    Examples

# \dontrun{
a = torch_randn(c(4, 4))
a
#> torch_tensor
#> 1.6530 -1.9398 -0.7858 -0.6979
#> 1.3467 2.4378 2.4695 -0.0903
#> 0.5428 -0.8464 -0.8918 -0.2703
#> 1.0460 -0.3144 0.2131 -0.1355
#> [ CPUFloatType{4,4} ]
torch_argmin(a)
#> torch_tensor
#> 1
#> [ CPULongType{} ]

a = torch_randn(c(4, 4))
a
#> torch_tensor
#> 0.3917 -1.7360 2.1245 -0.4908
#> -1.2249 -0.1974 0.3145 -1.2540
#> 2.5169 0.8670 1.2077 0.5393
#> 0.2843 -0.6558 -0.7945 1.3721
#> [ CPUFloatType{4,4} ]
torch_argmin(a, dim=1)
#> torch_tensor
#> 1
#> 0
#> 3
#> 1
#> [ CPULongType{4} ]
# }

diff --git a/docs/reference/torch_argsort.html b/docs/reference/torch_argsort.html
deleted file mode 100644
index b8a5f3ab5d700e2ab2dbca3c1694126af71f3a05..0000000000000000000000000000000000000000
--- a/docs/reference/torch_argsort.html
+++ /dev/null
@@ -1,237 +0,0 @@

    Argsort

Arguments

input: (Tensor) the input tensor.
dim: (int, optional) the dimension to sort along
descending: (bool, optional) controls the sorting order (ascending or descending)

    argsort(input, dim=-1, descending=False) -> LongTensor

Returns the indices that sort a tensor along a given dimension in ascending order by value.

This is the second value returned by torch_sort. See its documentation for the exact semantics of this method.

    Examples

# \dontrun{
a = torch_randn(c(4, 4))
a
#> torch_tensor
#> -0.8250 -0.5984 -1.2454 0.4598
#> -0.9256 0.0695 -1.6829 1.5544
#> 2.1622 0.7200 0.7667 -0.4872
#> 1.1699 0.8607 2.5965 0.0434
#> [ CPUFloatType{4,4} ]
torch_argsort(a, dim=1)
#> torch_tensor
#> 1 0 1 2
#> 0 1 0 3
#> 3 2 2 0
#> 2 3 3 1
#> [ CPULongType{4,4} ]
# }

diff --git a/docs/reference/torch_as_strided.html b/docs/reference/torch_as_strided.html
deleted file mode 100644
index 954e35c20b7fa67b826b53453620c6f7ac7e8b72..0000000000000000000000000000000000000000
--- a/docs/reference/torch_as_strided.html
+++ /dev/null
@@ -1,254 +0,0 @@

    As_strided

Arguments

input: (Tensor) the input tensor.
size: (tuple or ints) the shape of the output tensor
stride: (tuple or ints) the stride of the output tensor
storage_offset: (int, optional) the offset in the underlying storage of the output tensor

    as_strided(input, size, stride, storage_offset=0) -> Tensor

Create a view of an existing torch_Tensor input with specified size, stride and storage_offset.

Warning

More than one element of a created tensor may refer to a single memory location. As a result, in-place operations (especially ones that are vectorized) may result in incorrect behavior. If you need to write to the tensors, please clone them first.

Many PyTorch functions, which return a view of a tensor, are internally implemented with this function. Those functions, like torch_Tensor.expand, are easier to read and are therefore more advisable to use.

    Examples

# \dontrun{
x = torch_randn(c(3, 3))
x
#> torch_tensor
#> -1.5576 0.5216 -0.6254
#> 0.5108 -0.2964 0.5801
#> -0.7827 0.2806 0.0976
#> [ CPUFloatType{3,3} ]
t = torch_as_strided(x, list(2, 2), list(1, 2))
t
#> torch_tensor
#> -1.5576 -0.6254
#> 0.5216 0.5108
#> [ CPUFloatType{2,2} ]
t = torch_as_strided(x, list(2, 2), list(1, 2), 1)
t
#> torch_tensor
#> 0.5216 0.5108
#> -0.6254 -0.2964
#> [ CPUFloatType{2,2} ]
# }

diff --git a/docs/reference/torch_asin.html b/docs/reference/torch_asin.html
deleted file mode 100644
index fe3c2598bc89c49ab7ffa102777d2e611203ab2b..0000000000000000000000000000000000000000
--- a/docs/reference/torch_asin.html
+++ /dev/null
@@ -1,233 +0,0 @@

    Asin

Arguments

input: (Tensor) the input tensor.
out: (Tensor, optional) the output tensor.

    asin(input, out=None) -> Tensor

Returns a new tensor with the arcsine of the elements of input.

$$
\mbox{out}_{i} = \sin^{-1}(\mbox{input}_{i})
$$

    Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#> -0.3857
#> 1.8551
#> 0.4113
#> 0.7013
#> [ CPUFloatType{4} ]
torch_asin(a)
#> torch_tensor
#> -0.3959
#> nan
#> 0.4239
#> 0.7773
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_atan.html b/docs/reference/torch_atan.html
deleted file mode 100644
index d759b640802937397fa7064df55e8988bc3e14f8..0000000000000000000000000000000000000000
--- a/docs/reference/torch_atan.html
+++ /dev/null
@@ -1,233 +0,0 @@

    Atan

Arguments

input: (Tensor) the input tensor.
out: (Tensor, optional) the output tensor.

    atan(input, out=None) -> Tensor

Returns a new tensor with the arctangent of the elements of input.

$$
\mbox{out}_{i} = \tan^{-1}(\mbox{input}_{i})
$$

    Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#> -0.7742
#> 0.3914
#> -0.0984
#> 0.7190
#> [ CPUFloatType{4} ]
torch_atan(a)
#> torch_tensor
#> -0.6588
#> 0.3730
#> -0.0980
#> 0.6234
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_atan2.html b/docs/reference/torch_atan2.html
deleted file mode 100644
index 0e0582ddc63debbbe0ac6bb4a5702fbfcd3dbc84..0000000000000000000000000000000000000000
--- a/docs/reference/torch_atan2.html
+++ /dev/null
@@ -1,241 +0,0 @@

    Atan2

Arguments

input: (Tensor) the first input tensor
other: (Tensor) the second input tensor
out: (Tensor, optional) the output tensor.

    atan2(input, other, out=None) -> Tensor

Element-wise arctangent of \(\mbox{input}_{i} / \mbox{other}_{i}\) with consideration of the quadrant. Returns a new tensor with the signed angles in radians between vector \((\mbox{other}_{i}, \mbox{input}_{i})\) and vector \((1, 0)\). (Note that \(\mbox{other}_{i}\), the second parameter, is the x-coordinate, while \(\mbox{input}_{i}\), the first parameter, is the y-coordinate.)

The shapes of input and other must be broadcastable.

    Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#> 0.7384
#> 1.5533
#> 0.0480
#> 0.5090
#> [ CPUFloatType{4} ]
torch_atan2(a, torch_randn(c(4)))
#> torch_tensor
#> 0.3252
#> 1.4989
#> 3.0686
#> 1.3146
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_avg_pool1d.html b/docs/reference/torch_avg_pool1d.html
deleted file mode 100644
index 2799e95fa15f5e4b67ff0b31682d623db7443480..0000000000000000000000000000000000000000
--- a/docs/reference/torch_avg_pool1d.html
+++ /dev/null
@@ -1,232 +0,0 @@

    Avg_pool1d

Arguments

input: input tensor of shape \((\mbox{minibatch}, \mbox{in\_channels}, iW)\)
kernel_size: the size of the window. Can be a single number or a tuple (kW,)
stride: the stride of the window. Can be a single number or a tuple (sW,). Default: kernel_size
padding: implicit zero paddings on both sides of the input. Can be a single number or a tuple (padW,). Default: 0
ceil_mode: when True, will use ceil instead of floor to compute the output shape. Default: False
count_include_pad: when True, will include the zero-padding in the averaging calculation. Default: True

    avg_pool1d(input, kernel_size, stride=None, padding=0, ceil_mode=False, count_include_pad=True) -> Tensor

Applies a 1D average pooling over an input signal composed of several input planes.

See torch.nn.AvgPool1d for details and output shape.
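
Examples

A minimal sketch, not from the source page; output omitted since the input is random:

# \dontrun{
input = torch_randn(c(1, 1, 7))  # shape (minibatch, in_channels, iW)
torch_avg_pool1d(input, kernel_size = 3, stride = 2)  # averages over windows of width 3
# }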


diff --git a/docs/reference/torch_baddbmm.html b/docs/reference/torch_baddbmm.html
deleted file mode 100644
index 522f09378207b5d8ca93e80134872c88d54210eb..0000000000000000000000000000000000000000
--- a/docs/reference/torch_baddbmm.html
+++ /dev/null
@@ -1,303 +0,0 @@

    Baddbmm

Arguments

input: (Tensor) the tensor to be added
batch1: (Tensor) the first batch of matrices to be multiplied
batch2: (Tensor) the second batch of matrices to be multiplied
beta: (Number, optional) multiplier for input (\(\beta\))
alpha: (Number, optional) multiplier for \(\mbox{batch1} \mathbin{@} \mbox{batch2}\) (\(\alpha\))
out: (Tensor, optional) the output tensor.

    baddbmm(input, batch1, batch2, *, beta=1, alpha=1, out=None) -> Tensor

Performs a batch matrix-matrix product of matrices in batch1 and batch2. input is added to the final result.

batch1 and batch2 must be 3-D tensors each containing the same number of matrices.

If batch1 is a \((b \times n \times m)\) tensor and batch2 is a \((b \times m \times p)\) tensor, then input must be broadcastable with a \((b \times n \times p)\) tensor and out will be a \((b \times n \times p)\) tensor. Both alpha and beta mean the same as the scaling factors used in torch_addbmm.

$$
\mbox{out}_i = \beta\ \mbox{input}_i + \alpha\ (\mbox{batch1}_i \mathbin{@} \mbox{batch2}_i)
$$

For inputs of type FloatTensor or DoubleTensor, arguments beta and alpha must be real numbers, otherwise they should be integers.

    Examples

# \dontrun{
M = torch_randn(c(10, 3, 5))
batch1 = torch_randn(c(10, 3, 4))
batch2 = torch_randn(c(10, 4, 5))
torch_baddbmm(M, batch1, batch2)
#> torch_tensor
#> (1,.,.) =
#> 3.2697 -5.0643 0.0743 -0.2398 -2.5402
#> -0.3596 -0.1524 -2.2537 -0.6132 0.4815
#> 1.1825 -2.2500 0.8243 1.5010 -2.4894
#>
#> (2,.,.) =
#> 0.4770 -0.8900 3.0012 2.0244 2.9934
#> -2.0624 -0.7371 -0.6249 -1.4119 -1.0305
#> 0.4525 1.1938 1.2075 2.4423 0.5840
#>
#> (3,.,.) =
#> -1.5483 0.0002 -0.7736 -0.1712 2.3502
#> 1.3820 1.9069 -1.1504 2.8244 -0.5037
#> -0.7816 0.0485 3.1307 -0.7125 2.3957
#>
#> (4,.,.) =
#> 3.2263 1.9973 -2.7929 -0.6880 -1.8358
#> 3.9498 0.1835 -3.6300 -0.7907 -2.9265
#> 1.5720 -1.5571 -0.5235 0.2169 -0.7204
#>
#> (5,.,.) =
#> -1.5198 -1.4044 0.6454 1.6571 1.6412
#> 0.6481 -0.1620 0.7348 -2.5747 -1.5232
#> -3.9663 0.6486 -0.1782 -0.2130 -0.2005
#>
#> (6,.,.) =
#> -0.7923 -0.1696 -0.0210 -1.4651 0.1979
#> -0.2874 2.4903 -2.5324 0.1213 4.3363
#> 0.8367 0.5843 2.6930 -0.5081 -0.7514
#>
#> (7,.,.) =
#> 0.9376 -5.8062 -2.4161 -2.2368 1.7258
#> -0.5255 2.0584 1.1016 -2.1323 -1.1418
#> -1.8125 0.8110 -0.2142 1.9131 2.4363
#>
#> (8,.,.) =
#> -1.3094 0.0064 -0.5161 4.1986 -0.5380
#> -4.8329 -0.6216 0.9426 -3.9339 -1.2310
#> 7.8403 0.3146 -1.0314 3.4608 2.3111
#>
#> (9,.,.) =
#> -0.9910 -4.0243 4.6838 -4.8655 0.7247
#> 1.0314 0.6343 0.5493 -0.0418 -0.5915
#> 0.1801 0.7773 -1.0913 0.2247 -2.3853
#>
#> (10,.,.) =
#> -1.1792 0.4361 -0.6693 -0.4414 0.9327
#> -4.9029 -6.8475 -4.1729 -2.2513 0.6501
#> 1.3470 1.4167 -0.9282 0.5063 -2.4436
#> [ CPUFloatType{10,3,5} ]
# }

diff --git a/docs/reference/torch_bartlett_window.html b/docs/reference/torch_bartlett_window.html
deleted file mode 100644
index db4c45d51a6eb039e9efb1009dc0b39fc21f0162..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bartlett_window.html
+++ /dev/null
@@ -1,252 +0,0 @@

    Bartlett_window

Arguments

window_length: (int) the size of the returned window
periodic: (bool, optional) If True, returns a window to be used as a periodic function. If False, returns a symmetric window.
dtype: (torch.dtype, optional) the desired data type of the returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type). Only floating point types are supported.
layout: (torch.layout, optional) the desired layout of the returned window tensor. Only torch_strided (dense layout) is supported.
device: (torch.device, optional) the desired device of the returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.
requires_grad: (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    Note

If window_length \(=1\), the returned window contains a single value 1.

    bartlett_window(window_length, periodic=True, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Bartlett window function.

$$
w[n] = 1 - \left| \frac{2n}{N-1} - 1 \right| = \left\{ \begin{array}{ll}
  \frac{2n}{N - 1} & \mbox{if } 0 \leq n \leq \frac{N - 1}{2} \\
  2 - \frac{2n}{N - 1} & \mbox{if } \frac{N - 1}{2} < n < N \\
\end{array} \right. ,
$$

where \(N\) is the full window size.

The input window_length is a positive integer controlling the returned window size. The periodic flag determines whether the returned window trims off the last duplicate value from the symmetric window and is ready to be used as a periodic window with functions like torch_stft. Therefore, if periodic is true, the \(N\) in the above formula is in fact \(\mbox{window\_length} + 1\). Also, we always have torch_bartlett_window(L, periodic=True) equal to torch_bartlett_window(L + 1, periodic=False)[:-1].
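
Examples

A minimal sketch, not from the source page:

# \dontrun{
torch_bartlett_window(5)                   # periodic window of length 5
torch_bartlett_window(5, periodic = FALSE) # symmetric window
# }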


diff --git a/docs/reference/torch_bernoulli.html b/docs/reference/torch_bernoulli.html
deleted file mode 100644
index 2064f9d4a5b589b0c188ed848e1cdaf46e03bde4..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bernoulli.html
+++ /dev/null
@@ -1,256 +0,0 @@

    Bernoulli

Arguments

input: (Tensor) the input tensor of probability values for the Bernoulli distribution
generator: (torch.Generator, optional) a pseudorandom number generator for sampling
out: (Tensor, optional) the output tensor.

    bernoulli(input, *, generator=None, out=None) -> Tensor

Draws binary random numbers (0 or 1) from a Bernoulli distribution.

The input tensor should be a tensor containing probabilities to be used for drawing the binary random number. Hence, all values in input have to be in the range \(0 \leq \mbox{input}_i \leq 1\).

The \(\mbox{i}^{th}\) element of the output tensor will draw a value \(1\) according to the \(\mbox{i}^{th}\) probability value given in input.

$$
\mbox{out}_{i} \sim \mathrm{Bernoulli}(p = \mbox{input}_{i})
$$

The returned out tensor only has values 0 or 1 and is of the same shape as input.

out can have integral dtype, but input must have floating point dtype.

    Examples

# \dontrun{
a = torch_empty(c(3, 3))$uniform_(0, 1) # generate a uniform random matrix with range c(0, 1)
a
#> torch_tensor
#> 0.8765 0.8092 0.3962
#> 0.4623 0.3192 0.0298
#> 0.7755 0.1732 0.0310
#> [ CPUFloatType{3,3} ]
torch_bernoulli(a)
#> torch_tensor
#> 0 1 1
#> 0 1 0
#> 1 0 0
#> [ CPUFloatType{3,3} ]
a = torch_ones(c(3, 3)) # probability of drawing "1" is 1
torch_bernoulli(a)
#> torch_tensor
#> 1 1 1
#> 1 1 1
#> 1 1 1
#> [ CPUFloatType{3,3} ]
a = torch_zeros(c(3, 3)) # probability of drawing "1" is 0
torch_bernoulli(a)
#> torch_tensor
#> 0 0 0
#> 0 0 0
#> 0 0 0
#> [ CPUFloatType{3,3} ]
# }

diff --git a/docs/reference/torch_bincount.html b/docs/reference/torch_bincount.html
deleted file mode 100644
index e5588960fa217d166432a796252b0b69cfb1c92c..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bincount.html
+++ /dev/null
@@ -1,263 +0,0 @@

    Bincount

Arguments

input: (Tensor) 1-d int tensor
weights: (Tensor) optional, weight for each value in the input tensor. Should be of same size as input tensor.
minlength: (int) optional, minimum number of bins. Should be non-negative.

    bincount(input, weights=None, minlength=0) -> Tensor

Count the frequency of each value in an array of non-negative ints.

The number of bins (size 1) is one larger than the largest value in input unless input is empty, in which case the result is a tensor of size 0. If minlength is specified, the number of bins is at least minlength and if input is empty, then the result is a tensor of size minlength filled with zeros. If n is the value at position i, then out[n] += weights[i] if weights is specified, else out[n] += 1.

    Examples

# \dontrun{
input = torch_randint(0, 8, list(5), dtype=torch_int64())
weights = torch_linspace(0, 1, steps=5)
input
#> torch_tensor
#> 2
#> 7
#> 5
#> 3
#> 6
#> [ CPULongType{5} ]
weights
#> torch_tensor
#> 0.0000
#> 0.2500
#> 0.5000
#> 0.7500
#> 1.0000
#> [ CPUFloatType{5} ]
torch_bincount(input, weights)
#> torch_tensor
#> 0.0000
#> 0.0000
#> 0.0000
#> 0.7500
#> 0.0000
#> 0.5000
#> 1.0000
#> 0.2500
#> [ CPUFloatType{8} ]
input$bincount(weights)
#> torch_tensor
#> 0.0000
#> 0.0000
#> 0.0000
#> 0.7500
#> 0.0000
#> 0.5000
#> 1.0000
#> 0.2500
#> [ CPUFloatType{8} ]
# }

diff --git a/docs/reference/torch_bitwise_and.html b/docs/reference/torch_bitwise_and.html
deleted file mode 100644
index c0d1f81d5a4ebf3d1afe7d949d872933639a991d..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bitwise_and.html
+++ /dev/null
@@ -1,219 +0,0 @@

    Bitwise_and

Arguments

input: the first input tensor
other: the second input tensor
out: (Tensor, optional) the output tensor.

    bitwise_and(input, other, out=None) -> Tensor

Computes the bitwise AND of input and other. The input tensor must be of integral or Boolean types. For bool tensors, it computes the logical AND.
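
Examples

A minimal sketch, not from the source page; it mirrors the upstream PyTorch example:

# \dontrun{
torch_bitwise_and(torch_tensor(c(-1, -2, 3), dtype = torch_int8()),
                  torch_tensor(c(1, 0, 3), dtype = torch_int8()))  # expected: 1, 0, 3
torch_bitwise_and(torch_tensor(c(TRUE, TRUE, FALSE)),
                  torch_tensor(c(FALSE, TRUE, FALSE)))             # expected: FALSE, TRUE, FALSE
# }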


diff --git a/docs/reference/torch_bitwise_not.html b/docs/reference/torch_bitwise_not.html
deleted file mode 100644
index 1740d2d3635a97b83880e032724b0b37175b7a45..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bitwise_not.html
+++ /dev/null
@@ -1,215 +0,0 @@

    Bitwise_not

Arguments

input: (Tensor) the input tensor.
out: (Tensor, optional) the output tensor.

    bitwise_not(input, out=None) -> Tensor

Computes the bitwise NOT of the given input tensor. The input tensor must be of integral or Boolean types. For bool tensors, it computes the logical NOT.
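
Examples

A minimal sketch, not from the source page; it mirrors the upstream PyTorch example:

# \dontrun{
torch_bitwise_not(torch_tensor(c(-1, -2, 3), dtype = torch_int8()))  # expected: 0, 1, -4
# }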


diff --git a/docs/reference/torch_bitwise_or.html b/docs/reference/torch_bitwise_or.html
deleted file mode 100644
index b036f68ad764c534317ae23c5f4b51d2d07ca2e0..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bitwise_or.html
+++ /dev/null
@@ -1,219 +0,0 @@

    Bitwise_or

Arguments

input: the first input tensor
other: the second input tensor
out: (Tensor, optional) the output tensor.

    bitwise_or(input, other, out=None) -> Tensor

Computes the bitwise OR of input and other. The input tensor must be of integral or Boolean types. For bool tensors, it computes the logical OR.
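
Examples

A minimal sketch, not from the source page; it mirrors the upstream PyTorch example:

# \dontrun{
torch_bitwise_or(torch_tensor(c(-1, -2, 3), dtype = torch_int8()),
                 torch_tensor(c(1, 0, 3), dtype = torch_int8()))  # expected: -1, -2, 3
torch_bitwise_or(torch_tensor(c(TRUE, TRUE, FALSE)),
                 torch_tensor(c(FALSE, TRUE, FALSE)))             # expected: TRUE, TRUE, FALSE
# }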


diff --git a/docs/reference/torch_bitwise_xor.html b/docs/reference/torch_bitwise_xor.html
deleted file mode 100644
index 5c2d5ba04c6ab2476f159069323a02bc65000a90..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bitwise_xor.html
+++ /dev/null
@@ -1,219 +0,0 @@

    Bitwise_xor

Arguments

input: the first input tensor
other: the second input tensor
out: (Tensor, optional) the output tensor.

    bitwise_xor(input, other, out=None) -> Tensor

Computes the bitwise XOR of input and other. The input tensor must be of integral or Boolean types. For bool tensors, it computes the logical XOR.
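
Examples

A minimal sketch, not from the source page; it mirrors the upstream PyTorch example:

# \dontrun{
torch_bitwise_xor(torch_tensor(c(-1, -2, 3), dtype = torch_int8()),
                  torch_tensor(c(1, 0, 3), dtype = torch_int8()))  # expected: -2, -2, 0
torch_bitwise_xor(torch_tensor(c(TRUE, TRUE, FALSE)),
                  torch_tensor(c(FALSE, TRUE, FALSE)))             # expected: TRUE, FALSE, FALSE
# }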


diff --git a/docs/reference/torch_blackman_window.html b/docs/reference/torch_blackman_window.html
deleted file mode 100644
index 1d4b34c6e56273e562245e7850646f54f986d992..0000000000000000000000000000000000000000
--- a/docs/reference/torch_blackman_window.html
+++ /dev/null
@@ -1,248 +0,0 @@

    Blackman_window

Arguments

window_length: (int) the size of the returned window
periodic: (bool, optional) If True, returns a window to be used as a periodic function. If False, returns a symmetric window.
dtype: (torch.dtype, optional) the desired data type of the returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type). Only floating point types are supported.
layout: (torch.layout, optional) the desired layout of the returned window tensor. Only torch_strided (dense layout) is supported.
device: (torch.device, optional) the desired device of the returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.
requires_grad: (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    Note

If window_length \(=1\), the returned window contains a single value 1.

    blackman_window(window_length, periodic=True, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Blackman window function.

$$
w[n] = 0.42 - 0.5 \cos \left( \frac{2 \pi n}{N - 1} \right) + 0.08 \cos \left( \frac{4 \pi n}{N - 1} \right),
$$

where \(N\) is the full window size.

The input window_length is a positive integer controlling the returned window size. The periodic flag determines whether the returned window trims off the last duplicate value from the symmetric window and is ready to be used as a periodic window with functions like torch_stft. Therefore, if periodic is true, the \(N\) in the above formula is in fact \(\mbox{window\_length} + 1\). Also, we always have torch_blackman_window(L, periodic=True) equal to torch_blackman_window(L + 1, periodic=False)[:-1].
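
Examples

A minimal sketch, not from the source page:

# \dontrun{
torch_blackman_window(5)                   # periodic window of length 5
torch_blackman_window(5, periodic = FALSE) # symmetric window
# }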


diff --git a/docs/reference/torch_bmm.html b/docs/reference/torch_bmm.html
deleted file mode 100644
index 1f93f8d97f054f6f0234a2f83be4bcd49a828bfe..0000000000000000000000000000000000000000
--- a/docs/reference/torch_bmm.html
+++ /dev/null
@@ -1,289 +0,0 @@

    Bmm

Arguments

input: (Tensor) the first batch of matrices to be multiplied
mat2: (Tensor) the second batch of matrices to be multiplied
out: (Tensor, optional) the output tensor.

    Note

This function does not broadcast. For broadcasting matrix products, see torch_matmul.

    bmm(input, mat2, out=None) -> Tensor

Performs a batch matrix-matrix product of matrices stored in input and mat2.

input and mat2 must be 3-D tensors each containing the same number of matrices.

If input is a \((b \times n \times m)\) tensor and mat2 is a \((b \times m \times p)\) tensor, out will be a \((b \times n \times p)\) tensor.

$$
\mbox{out}_i = \mbox{input}_i \mathbin{@} \mbox{mat2}_i
$$

    Examples

# \dontrun{
input = torch_randn(c(10, 3, 4))
mat2 = torch_randn(c(10, 4, 5))
res = torch_bmm(input, mat2)
res
#> torch_tensor
#> (1,.,.) =
#> -1.1937 1.0490 -1.3460 -1.3636 0.2908
#> -0.7399 -0.3916 -0.0894 1.5547 -0.5792
#> 1.6370 -1.8825 0.6914 0.4735 0.2958
#>
#> (2,.,.) =
#> 3.0209 -2.4298 0.3410 0.0615 -2.5501
#> -3.8228 0.7082 0.9869 0.1536 1.1400
#> 2.5718 -0.3476 1.3377 0.6290 -0.2315
#>
#> (3,.,.) =
#> -0.2813 -0.3510 -0.6811 -0.8482 1.3861
#> 3.3843 -1.2077 -1.9622 -1.1351 -1.8477
#> 2.6732 -2.4184 0.7855 2.8759 -1.4808
#>
#> (4,.,.) =
#> -6.8546 1.0791 2.1027 -2.8185 0.7520
#> -0.9041 0.8896 2.4743 0.6284 0.2519
#> 2.6052 -1.4564 -1.6375 1.3288 0.3487
#>
#> (5,.,.) =
#> 2.5222 -1.6164 -2.2116 -1.0754 0.7719
#> -3.6324 2.5302 0.9988 -2.1378 0.6788
#> 5.6221 0.7932 2.1447 4.9035 -5.1887
#>
#> (6,.,.) =
#> 0.2683 -1.0509 2.6643 -0.2398 0.4529
#> -2.3240 -3.0188 2.6981 1.3544 0.8555
#> -0.4469 0.3477 1.0020 4.7555 1.9801
#>
#> (7,.,.) =
#> -1.5234 0.5375 0.0234 2.3384 -3.3980
#> 1.3228 3.1686 1.4053 -2.2938 7.3319
#> 1.9968 -5.2192 -0.6723 -1.0900 -3.2833
#>
#> (8,.,.) =
#> -2.3741 2.0837 -0.4425 -1.5224 2.2040
#> -0.7937 1.1621 6.6647 0.5726 1.9161
#> -1.2275 -0.9221 -1.9841 -0.4629 0.8082
#>
#> (9,.,.) =
#> -1.4034 -0.0159 -0.7663 0.1020 0.4187
#> -1.1376 2.4816 1.0544 1.9942 -1.1878
#> -0.8727 0.0230 -0.5804 0.0939 0.1387
#>
#> (10,.,.) =
#> -0.4053 -0.3109 -2.1835 -0.1594 1.8523
#> -1.8857 1.2782 3.0087 -2.7136 -0.9552
#> -1.9781 1.0313 2.1664 -2.4844 -0.7711
#> [ CPUFloatType{10,3,5} ]
# }

diff --git a/docs/reference/torch_broadcast_tensors.html b/docs/reference/torch_broadcast_tensors.html
deleted file mode 100644
index 164b14018a174ef96b75b934042ae0894291fd96..0000000000000000000000000000000000000000
--- a/docs/reference/torch_broadcast_tensors.html
+++ /dev/null
@@ -1,221 +0,0 @@

    Broadcast_tensors

Arguments

*tensors: any number of tensors of the same type

    broadcast_tensors(*tensors) -> List of Tensors

Broadcasts the given tensors according to broadcasting semantics.

    Examples

# \dontrun{
x = torch_arange(0, 3)$view(c(1, 3))
y = torch_arange(0, 2)$view(c(2, 1))
out = torch_broadcast_tensors(list(x, y))
out[[1]]
#> torch_tensor
#> 0 1 2
#> 0 1 2
#> [ CPUFloatType{2,3} ]
# }

diff --git a/docs/reference/torch_can_cast.html b/docs/reference/torch_can_cast.html
deleted file mode 100644
index 79fb56f536f5bf1df83b12f62a10ecc9a43f0c78..0000000000000000000000000000000000000000
--- a/docs/reference/torch_can_cast.html
+++ /dev/null
@@ -1,220 +0,0 @@

    Can_cast

Arguments

from: (dtype) The original torch_dtype.
to: (dtype) The target torch_dtype.

    can_cast(from, to) -> bool

Determines if a type conversion is allowed under PyTorch casting rules described in the type promotion documentation.

    Examples

# \dontrun{
torch_can_cast(torch_double(), torch_float())
#> [1] TRUE
torch_can_cast(torch_float(), torch_int())
#> [1] FALSE
# }

diff --git a/docs/reference/torch_cartesian_prod.html b/docs/reference/torch_cartesian_prod.html
deleted file mode 100644
index 16340dd69e95d51d074c719ff597d38e63e84ea4..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cartesian_prod.html
+++ /dev/null
@@ -1,227 +0,0 @@

    Cartesian_prod

Arguments

*tensors: any number of 1-dimensional tensors.

Does the cartesian product of the given sequence of tensors. The behavior is similar to Python's itertools.product.

    Examples

# \dontrun{
a = c(1, 2, 3)
b = c(4, 5)
tensor_a = torch_tensor(a)
tensor_b = torch_tensor(b)
torch_cartesian_prod(list(tensor_a, tensor_b))
#> torch_tensor
#> 1 4
#> 1 5
#> 2 4
#> 2 5
#> 3 4
#> 3 5
#> [ CPUFloatType{6,2} ]
# }

diff --git a/docs/reference/torch_cat.html b/docs/reference/torch_cat.html
deleted file mode 100644
index 4fded00c9f98ae55531dc2803498ba2726d862a8..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cat.html
+++ /dev/null
@@ -1,242 +0,0 @@

    Cat

Arguments

tensors: (sequence of Tensors) any sequence of tensors of the same type. Non-empty tensors provided must have the same shape, except in the cat dimension.
dim: (int, optional) the dimension over which the tensors are concatenated
out: (Tensor, optional) the output tensor.

    cat(tensors, dim=0, out=None) -> Tensor

Concatenates the given sequence of tensors in the given dimension. All tensors must either have the same shape (except in the concatenating dimension) or be empty.

torch_cat can be seen as an inverse operation for torch_split() and torch_chunk.

torch_cat can be best understood via examples.

    Examples

# \dontrun{
x = torch_randn(c(2, 3))
x
#> torch_tensor
#> 1.5078 0.8533 1.7774
#> -0.7864 1.4110 0.6703
#> [ CPUFloatType{2,3} ]
torch_cat(list(x, x, x), 1)
#> torch_tensor
#> 1.5078 0.8533 1.7774
#> -0.7864 1.4110 0.6703
#> 1.5078 0.8533 1.7774
#> -0.7864 1.4110 0.6703
#> 1.5078 0.8533 1.7774
#> -0.7864 1.4110 0.6703
#> [ CPUFloatType{6,3} ]
torch_cat(list(x, x, x), 2)
#> torch_tensor
#> 1.5078 0.8533 1.7774 1.5078 0.8533 1.7774 1.5078 0.8533 1.7774
#> -0.7864 1.4110 0.6703 -0.7864 1.4110 0.6703 -0.7864 1.4110 0.6703
#> [ CPUFloatType{2,9} ]
# }

diff --git a/docs/reference/torch_cdist.html b/docs/reference/torch_cdist.html
deleted file mode 100644
index 3857005762576c42de194fe92e47160080b76ba1..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cdist.html
+++ /dev/null
@@ -1,222 +0,0 @@

    Cdist

Arguments

x1: (Tensor) input tensor of shape \(B \times P \times M\).
x2: (Tensor) input tensor of shape \(B \times R \times M\).
p: p value for the p-norm distance to calculate between each vector pair, \(\in [0, \infty]\).
compute_mode: 'use_mm_for_euclid_dist_if_necessary' will use a matrix multiplication approach to calculate the Euclidean distance (p = 2) if P > 25 or R > 25; 'use_mm_for_euclid_dist' will always use the matrix multiplication approach; 'donot_use_mm_for_euclid_dist' will never use the matrix multiplication approach. Default: use_mm_for_euclid_dist_if_necessary.

Computes the batched p-norm distance between each pair of row vectors from the two collections.
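
Examples

A minimal sketch, not from the source page; output omitted since the inputs are random:

# \dontrun{
x1 = torch_randn(c(3, 2))
x2 = torch_randn(c(4, 2))
torch_cdist(x1, x2, p = 2)  # 3 x 4 matrix of pairwise Euclidean distances
# }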


diff --git a/docs/reference/torch_ceil.html b/docs/reference/torch_ceil.html
deleted file mode 100644
index 0c413d78a68a37834cd5f76b8bbf9f1a26a48c4b..0000000000000000000000000000000000000000
--- a/docs/reference/torch_ceil.html
+++ /dev/null
@@ -1,234 +0,0 @@

    Ceil

Arguments

input: (Tensor) the input tensor.
out: (Tensor, optional) the output tensor.

    ceil(input, out=None) -> Tensor

Returns a new tensor with the ceil of the elements of input, the smallest integer greater than or equal to each element.

$$
\mbox{out}_{i} = \left\lceil \mbox{input}_{i} \right\rceil
$$

    Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#> 0.9465
#> 1.5480
#> -0.6969
#> -0.4820
#> [ CPUFloatType{4} ]
torch_ceil(a)
#> torch_tensor
#> 1
#> 2
#> -0
#> -0
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_celu_.html b/docs/reference/torch_celu_.html
deleted file mode 100644
index 30d7f467201dc2ec0935732fdfdccc3cc2a89e45..0000000000000000000000000000000000000000
--- a/docs/reference/torch_celu_.html
+++ /dev/null
@@ -1,202 +0,0 @@

    Celu_


    celu_(input, alpha=1.) -> Tensor


    In-place version of torch_celu.
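
Examples

A minimal sketch, not from the source page; the trailing underscore marks an in-place op, so the input tensor itself is modified:

# \dontrun{
x = torch_randn(c(2, 2))
torch_celu_(x)  # applies CELU and overwrites x
x               # x now holds the result
# }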


diff --git a/docs/reference/torch_chain_matmul.html b/docs/reference/torch_chain_matmul.html
deleted file mode 100644
index 72b4647aaf2dcd17072e8af3eab16be1c9b165a3..0000000000000000000000000000000000000000
--- a/docs/reference/torch_chain_matmul.html
+++ /dev/null
@@ -1,227 +0,0 @@

    Chain_matmul

Arguments

matrices: (Tensors...) a sequence of 2 or more 2-D tensors whose product is to be determined.

Returns the matrix product of the \(N\) 2-D tensors. This product is efficiently computed using the matrix chain order algorithm, which selects the order that incurs the lowest cost in terms of arithmetic operations ([CLRS]). Note that since this is a function to compute the product, \(N\) needs to be greater than or equal to 2; if equal to 2 then a trivial matrix-matrix product is returned. If \(N\) is 1, then this is a no-op: the original matrix is returned as is.

    Examples

# \dontrun{
a = torch_randn(c(3, 4))
b = torch_randn(c(4, 5))
c = torch_randn(c(5, 6))
d = torch_randn(c(6, 7))
torch_chain_matmul(list(a, b, c, d))
#> torch_tensor
#> 2.2025 6.9263 -12.0433 -1.8318 6.1157 -1.9091 -2.5474
#> -9.2675 3.6580 -10.9555 3.7499 -0.9984 -2.1468 18.3629
#> -4.3318 -10.0159 20.3315 2.5116 -9.5372 3.4920 7.3516
#> [ CPUFloatType{3,7} ]
# }

diff --git a/docs/reference/torch_cholesky.html b/docs/reference/torch_cholesky.html
deleted file mode 100644
index 25dd309624728007ae76e3d5b653a79703ecfc8c..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cholesky.html
+++ /dev/null
@@ -1,236 +0,0 @@

    Cholesky

Arguments

input: (Tensor) the input tensor \(A\) of size \((*, n, n)\) where \(*\) is zero or more batch dimensions consisting of symmetric positive-definite matrices.
upper: (bool, optional) flag that indicates whether to return an upper or lower triangular matrix. Default: False
out: (Tensor, optional) the output matrix

    cholesky(input, upper=False, out=None) -> Tensor

Computes the Cholesky decomposition of a symmetric positive-definite matrix \(A\) or for batches of symmetric positive-definite matrices.

If upper is True, the returned matrix U is upper-triangular, and the decomposition has the form:

$$
A = U^TU
$$

If upper is False, the returned matrix L is lower-triangular, and the decomposition has the form:

$$
A = LL^T
$$

If upper is True, and \(A\) is a batch of symmetric positive-definite matrices, then the returned tensor will be composed of upper-triangular Cholesky factors of each of the individual matrices. Similarly, when upper is False, the returned tensor will be composed of lower-triangular Cholesky factors of each of the individual matrices.

    Examples
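
A minimal sketch, not from the source page; it mirrors the setup of the torch_cholesky_solve example below:

# \dontrun{
a = torch_randn(c(3, 3))
a = torch_mm(a, a$t())  # make symmetric positive definite
l = torch_cholesky(a)   # lower-triangular factor (the default)
torch_mm(l, l$t())      # reconstructs a up to numerical error
# }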

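A minimal sketch in the style of the other examples on these pages; the torch_eye() jitter is an addition here to keep the random matrix well-conditioned:

# \dontrun{
# build a random symmetric positive-definite matrix
a = torch_randn(c(3, 3))
a = torch_mm(a, a$t()) + torch_eye(3) * 1e-3
l = torch_cholesky(a)   # lower-triangular factor (upper = FALSE is the default)
torch_mm(l, l$t())      # reconstructs a up to numerical error
# }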
diff --git a/docs/reference/torch_cholesky_inverse.html b/docs/reference/torch_cholesky_inverse.html
deleted file mode 100644
index 6c559ffd69c9f6fb2a795f71fb9f290096dc5a1f..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cholesky_inverse.html
+++ /dev/null
@@ -1,232 +0,0 @@

Cholesky_inverse — torch_cholesky_inverse • torch

    Cholesky_inverse


    Arguments

    input

(Tensor) the input 2-D tensor \(u\), an upper or lower triangular Cholesky factor

    upper

    (bool, optional) whether to return a lower (default) or upper triangular matrix

    out

    (Tensor, optional) the output tensor for inv


    cholesky_inverse(input, upper=False, out=None) -> Tensor


Computes the inverse of a symmetric positive-definite matrix \(A\) using its Cholesky factor \(u\): returns the matrix inv. The inverse is computed using the LAPACK routines dpotri and spotri (and the corresponding MAGMA routines).

If upper is False, \(u\) is lower triangular, such that the returned tensor is

$$ inv = (u u^T)^{-1} $$

If upper is True, \(u\) is upper triangular, such that the returned tensor is

$$ inv = (u^T u)^{-1} $$


    Examples

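A minimal sketch, assuming the same helpers used in the torch_cholesky_solve example (torch_randn, torch_mm, torch_cholesky); the torch_eye() jitter is an addition here:

# \dontrun{
a = torch_randn(c(3, 3))
a = torch_mm(a, a$t()) + torch_eye(3) * 1e-3  # symmetric positive-definite
u = torch_cholesky(a)             # lower factor, matching upper = FALSE
inv = torch_cholesky_inverse(u)
torch_mm(inv, a)                  # approximately the identity matrix
# }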
diff --git a/docs/reference/torch_cholesky_solve.html b/docs/reference/torch_cholesky_solve.html
deleted file mode 100644
index 02ae006d1262a1973058296f4c951c472bc15915..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cholesky_solve.html
+++ /dev/null
@@ -1,261 +0,0 @@

Cholesky_solve — torch_cholesky_solve • torch

    Cholesky_solve


    Arguments

    input

    (Tensor) input matrix \(b\) of size \((*, m, k)\), where \(*\) is zero or more batch dimensions

    input2

(Tensor) input matrix \(u\) of size \((*, m, m)\), where \(*\) is zero or more batch dimensions composed of upper or lower triangular Cholesky factors

    upper

    (bool, optional) whether to consider the Cholesky factor as a lower or upper triangular matrix. Default: False.

    out

    (Tensor, optional) the output tensor for c


    cholesky_solve(input, input2, upper=False, out=None) -> Tensor


Solves a linear system of equations with a positive semidefinite matrix to be inverted, given its Cholesky factor matrix \(u\).

If upper is False, \(u\) is lower triangular and c is returned such that:

$$ c = (u u^T)^{-1} b $$

If upper is True, \(u\) is upper triangular and c is returned such that:

$$ c = (u^T u)^{-1} b $$

torch_cholesky_solve(b, u) can take in 2D inputs b, u or inputs that are batches of 2D matrices. If the inputs are batches, then batched outputs c are returned.


    Examples

# \dontrun{
a = torch_randn(c(3, 3))
a = torch_mm(a, a$t()) # make symmetric positive definite
u = torch_cholesky(a)
a
#> torch_tensor
#>  4.8833 -0.7896 -0.4785
#> -0.7896  1.0348 -0.2048
#> -0.4785 -0.2048  0.8552
#> [ CPUFloatType{3,3} ]
b = torch_randn(c(3, 2))
b
#> torch_tensor
#>  0.5712 -0.1153
#> -1.2014  0.0291
#>  1.1547  0.9237
#> [ CPUFloatType{3,2} ]
torch_cholesky_solve(b, u)
#> torch_tensor
#>  0.0975  0.1667
#> -0.8489  0.4068
#>  1.2015  1.2708
#> [ CPUFloatType{3,2} ]
torch_mm(a$inverse(), b)
#> torch_tensor
#>  0.0975  0.1667
#> -0.8489  0.4068
#>  1.2015  1.2708
#> [ CPUFloatType{3,2} ]
# }
diff --git a/docs/reference/torch_chunk.html b/docs/reference/torch_chunk.html
deleted file mode 100644
index 796c98a337775f327c646ea4921b425f357d9e47..0000000000000000000000000000000000000000
--- a/docs/reference/torch_chunk.html
+++ /dev/null
@@ -1,221 +0,0 @@

Chunk — torch_chunk • torch

    Chunk


    Arguments

    input

    (Tensor) the tensor to split

    chunks

    (int) number of chunks to return

    dim

    (int) dimension along which to split the tensor


    chunk(input, chunks, dim=0) -> List of Tensors


Splits a tensor into a specific number of chunks. Each chunk is a view of the input tensor.

The last chunk will be smaller if the tensor size along the given dimension dim is not divisible by chunks.
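
Examples

A minimal sketch, assuming the 1-based dim convention of the R interface (the signature above shows the Python default dim=0):

# \dontrun{
x = torch_tensor(1:11)
torch_chunk(x, chunks = 3, dim = 1)
# a list of three views of x: the first two hold 4 elements each, the
# last holds 3, because 11 is not divisible by 3
# }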

diff --git a/docs/reference/torch_clamp.html b/docs/reference/torch_clamp.html
deleted file mode 100644
index 77e9449e77a451d4fcebea6fd4bf38886ca91910..0000000000000000000000000000000000000000
--- a/docs/reference/torch_clamp.html
+++ /dev/null
@@ -1,295 +0,0 @@

Clamp — torch_clamp • torch

    Clamp


    Arguments

    input

    (Tensor) the input tensor.

    min

    (Number) lower-bound of the range to be clamped to

    max

    (Number) upper-bound of the range to be clamped to

    out

    (Tensor, optional) the output tensor.

    value

    (Number) minimal value of each element in the output


    clamp(input, min, max, out=None) -> Tensor


Clamps all elements in input into the range [min, max] and returns the resulting tensor:

$$
y_i = \left\{ \begin{array}{ll}
\mbox{min} & \mbox{if } x_i < \mbox{min} \\
x_i & \mbox{if } \mbox{min} \leq x_i \leq \mbox{max} \\
\mbox{max} & \mbox{if } x_i > \mbox{max}
\end{array} \right.
$$

If input is of type FloatTensor or DoubleTensor, the args min and max must be real numbers; otherwise they should be integers.

    clamp(input, *, min, out=None) -> Tensor


Clamps all elements in input to be larger than or equal to min.

If input is of type FloatTensor or DoubleTensor, value should be a real number; otherwise it should be an integer.

    clamp(input, *, max, out=None) -> Tensor


Clamps all elements in input to be smaller than or equal to max.

If input is of type FloatTensor or DoubleTensor, value should be a real number; otherwise it should be an integer.


    Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#> -0.9506
#>  2.2284
#> -0.7040
#> -0.4355
#> [ CPUFloatType{4} ]
torch_clamp(a, min=-0.5, max=0.5)
#> torch_tensor
#> -0.5000
#>  0.5000
#> -0.5000
#> -0.4355
#> [ CPUFloatType{4} ]

a = torch_randn(c(4))
a
#> torch_tensor
#>  0.9982
#> -1.4524
#> -1.4201
#>  0.5077
#> [ CPUFloatType{4} ]
torch_clamp(a, min=0.5)
#> torch_tensor
#>  0.9982
#>  0.5000
#>  0.5000
#>  0.5077
#> [ CPUFloatType{4} ]

a = torch_randn(c(4))
a
#> torch_tensor
#>  1.9805
#> -1.3783
#>  0.7469
#> -0.5865
#> [ CPUFloatType{4} ]
torch_clamp(a, max=0.5)
#> torch_tensor
#>  0.5000
#> -1.3783
#>  0.5000
#> -0.5865
#> [ CPUFloatType{4} ]
# }
diff --git a/docs/reference/torch_combinations.html b/docs/reference/torch_combinations.html
deleted file mode 100644
index e4e0e223e5a98b96f5bb296d5ed0111615993fb8..0000000000000000000000000000000000000000
--- a/docs/reference/torch_combinations.html
+++ /dev/null
@@ -1,240 +0,0 @@

Combinations — torch_combinations • torch

    Combinations


    Arguments

    input

    (Tensor) 1D vector.

    r

    (int, optional) number of elements to combine

    with_replacement

(boolean, optional) whether to allow duplicate elements within a combination


    combinations(input, r=2, with_replacement=False) -> seq


Computes combinations of length \(r\) of the given tensor. The behavior is similar to Python's itertools.combinations when with_replacement is set to False, and to itertools.combinations_with_replacement when with_replacement is set to True.


    Examples

# \dontrun{
a = c(1, 2, 3)
tensor_a = torch_tensor(a)
torch_combinations(tensor_a)
#> torch_tensor
#>  1  2
#>  1  3
#>  2  3
#> [ CPUFloatType{3,2} ]
torch_combinations(tensor_a, r=3)
#> torch_tensor
#>  1  2  3
#> [ CPUFloatType{1,3} ]
torch_combinations(tensor_a, with_replacement=TRUE)
#> torch_tensor
#>  1  1
#>  1  2
#>  1  3
#>  2  2
#>  2  3
#>  3  3
#> [ CPUFloatType{6,2} ]
# }
diff --git a/docs/reference/torch_conj.html b/docs/reference/torch_conj.html
deleted file mode 100644
index 919a3dc4dc70b564ac5b89a2fa4850be8c027344..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conj.html
+++ /dev/null
@@ -1,219 +0,0 @@

Conj — torch_conj • torch

    Conj


    Arguments

    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.


    conj(input, out=None) -> Tensor


    Computes the element-wise conjugate of the given input tensor.

$$ \mbox{out}_{i} = conj(\mbox{input}_{i}) $$


    Examples

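A minimal sketch; for real-valued tensors the conjugate equals the input, which makes the result easy to check:

# \dontrun{
x = torch_randn(c(3))
torch_conj(x)  # equal to x for real dtypes
# }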
diff --git a/docs/reference/torch_conv1d.html b/docs/reference/torch_conv1d.html
deleted file mode 100644
index 6d5137784bbe5f5bd0722bbfe909e6b0fe316dc9..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conv1d.html
+++ /dev/null
@@ -1,4820 +0,0 @@

Conv1d — torch_conv1d • torch

    Conv1d


    Arguments

    input

(Tensor) input tensor of shape \((\mbox{minibatch} , \mbox{in\_channels} , iW)\)

    weight

(Tensor) filters of shape \((\mbox{out\_channels} , \frac{\mbox{in\_channels}}{\mbox{groups}} , kW)\)

    bias

(Tensor, optional) optional bias of shape \((\mbox{out\_channels})\). Default: None

    stride

(int or tuple) the stride of the convolving kernel. Can be a single number or a one-element tuple (sW,). Default: 1

    padding

(int or tuple) implicit paddings on both sides of the input. Can be a single number or a one-element tuple (padW,). Default: 0

    dilation

(int or tuple) the spacing between kernel elements. Can be a single number or a one-element tuple (dW,). Default: 1

    groups

(int) split input into groups; \(\mbox{in\_channels}\) should be divisible by the number of groups. Default: 1


    conv1d(input, weight, bias=None, stride=1, padding=0, dilation=1, groups=1) -> Tensor


Applies a 1D convolution over an input signal composed of several input planes.

See torch.nn.Conv1d for details and output shape.
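
For reference, the output width follows the standard Conv1d output-shape rule (stated here as a convenience, using the symbols from the Arguments table above):

$$ iW_{out} = \left\lfloor \frac{iW + 2 \times \mbox{padding} - \mbox{dilation} \times (kW - 1) - 1}{\mbox{stride}} + 1 \right\rfloor $$

With the defaults (stride 1, no padding, no dilation) and the example below, where \(iW = 50\) and \(kW = 3\), this gives an output width of 48.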

Note: in some circumstances, when given tensors on a CUDA device and using cuDNN, this operator may select a nondeterministic algorithm to increase performance.


    Examples

# \dontrun{
filters = torch_randn(c(33, 16, 3))
inputs = torch_randn(c(20, 16, 50))
nnf_conv1d(inputs, filters)
#> torch_tensor
#> (1,.,.) =
#>  ... (printout of the full 20 x 33 x 48 result omitted)
#> [ CPUFloatType{20,33,48} ]
# }
= -#> Columns 1 to 6 -2.4260e+00 -1.6090e+01 -7.1397e+00 1.0959e+01 1.3156e+01 -3.1173e+00 -#> -8.2181e+00 -1.3502e+01 3.9274e+00 -2.2057e+00 -1.2749e+01 4.7140e+00 -#> -1.4480e+01 -2.6388e+00 1.6621e+01 2.2499e+00 -1.5909e+01 9.4288e-02 -#> -4.2047e+00 4.9874e+00 1.2744e+01 -2.2830e+00 -6.1745e+00 -3.4464e+00 -#> -1.0277e+01 -1.6134e+00 -8.3036e-02 -1.7949e+00 -1.4716e+00 -1.4627e+00 -#> -2.1850e+00 -6.9099e+00 3.0435e+00 8.3562e+00 2.9605e+00 -5.9025e+00 -#> -1.0164e+00 -4.8070e-01 2.9863e+00 -1.1825e+00 -1.4028e+01 4.0764e+00 -#> -4.2190e-02 -5.4516e+00 6.1648e+00 -5.6370e+00 4.4230e-01 5.5009e+00 -#> 3.7369e+00 6.2467e+00 6.4830e-01 3.9464e+00 4.0394e+00 -1.3070e+01 -#> -3.4354e+00 -1.3006e+01 1.2709e+00 2.2926e+00 -7.9340e+00 7.8264e-01 -#> -2.8094e+00 -6.3679e-01 3.3087e+00 -4.7293e+00 1.1885e+00 8.2875e+00 -#> -9.3574e+00 -1.2533e+01 5.0777e+00 6.9207e+00 9.7479e+00 1.9588e+01 -#> -4.3638e-01 1.3491e+01 -6.4486e+00 -7.1892e+00 7.7435e-01 -9.9856e+00 -#> -1.7874e+00 9.6274e+00 1.0126e+01 -1.0308e+01 8.8940e+00 -8.6108e+00 -#> -1.1105e+01 8.8923e+00 6.5329e+00 -8.6803e+00 -7.5524e+00 -8.0187e+00 -#> 4.4636e-01 4.6858e+00 1.1691e+01 1.0431e+00 -7.0344e-01 -8.0854e+00 -#> -5.0888e+00 -1.2481e+01 -4.9545e+00 5.0573e+00 -1.8414e-01 3.1990e+00 -#> -8.5629e-01 -9.0537e+00 4.4783e+00 1.1009e+01 4.8588e+00 -4.0643e+00 -#> -3.3022e+00 2.5722e+00 1.6664e+01 -2.9806e+00 -1.4120e+01 -4.9962e+00 -#> -5.7592e+00 -1.0260e+01 -8.1434e-01 4.3060e-01 -4.0868e+00 9.3831e+00 -#> -1.3060e+01 -3.0198e+00 1.7335e+01 5.5972e+00 7.4274e+00 5.9574e+00 -#> 3.5454e-01 1.7126e+01 -2.0991e+00 -9.9543e+00 3.3188e-01 -6.5695e+00 -#> -3.2862e+00 3.9254e+00 1.3965e+01 -1.2239e+00 -1.8622e-01 3.0077e+00 -#> 8.3303e+00 -9.2190e+00 3.1816e+00 1.9540e+00 8.2113e-01 -6.3173e+00 -#> 6.4037e+00 -6.7632e+00 7.7021e+00 1.1969e+01 5.9819e+00 4.0527e+00 -#> -1.1441e+00 -7.0357e+00 5.4759e+00 -4.3076e+00 -8.1542e+00 2.4943e+00 -#> -3.9106e+00 -7.4675e-01 -6.9691e+00 -2.8430e+00 -6.4367e-01 9.7454e+00 -#> -9.9073e+00 -4.1061e+00 1.1572e+01 -1.0203e+00 2.5985e+00 1.9412e+01 -#> 3.7332e+00 -7.6847e+00 9.5770e+00 -3.4412e+00 5.8689e+00 1.1729e+01 -#> -6.3888e+00 3.3612e+00 5.4892e+00 2.0501e+00 8.9452e+00 1.8355e+00 -#> -1.4714e+01 -4.1082e+00 -1.1987e+00 4.5001e-01 3.4602e+00 2.8930e+00 -#> 8.4743e+00 1.4654e+01 -2.0721e+00 6.1471e+00 -3.8771e+00 3.2316e+00 -#> -2.5601e+00 1.2247e+01 -3.2863e+00 -6.9482e+00 7.3815e+00 5.2209e-01 -#> -#> Columns 7 to 12 2.7039e+00 1.7928e+01 4.7722e+00 -9.0033e+00 -9.2337e+00 4.3650e+00 -#> 5.6106e-01 -3.7155e-01 -9.4022e-01 6.6389e-01 -1.0687e+00 -3.6676e+00 -#> -1.5103e+00 -4.8982e+00 2.8042e+00 3.2289e+00 1.7656e+00 1.3508e+00 -#> -1.7560e+00 -1.2062e+00 -4.4098e+00 -8.6041e+00 -9.1194e-01 -2.4839e+00 -#> -7.7607e-02 -1.4647e+00 -1.4750e+00 2.2109e-01 -9.2688e+00 -1.6978e+00 -#> -6.4080e+00 -1.4036e+01 -3.3036e+00 7.0697e+00 -2.8365e+00 -1.5730e+00 -#> -8.7269e-02 2.7889e+00 6.2123e+00 3.2134e+00 -4.1613e+00 -4.4587e+00 -#> -2.1674e+00 -1.1504e+00 5.9939e+00 1.1103e+01 1.1826e+01 5.6860e-01 -#> -6.9300e+00 -6.7035e+00 -7.3591e+00 -7.8506e+00 -2.8402e+00 -7.3483e+00 -#> -1.1257e+00 2.5082e+00 2.6712e+00 -3.2684e+00 -4.4637e+00 8.8603e-01 -#> -7.6516e-02 -1.4251e-01 -2.3874e+00 1.1411e+00 -2.4513e+00 -1.6663e+01 -#> 4.6774e+00 -1.4641e+00 8.1328e-01 -1.1810e+01 -1.5955e+01 1.2982e+01 -#> -8.5431e+00 3.9702e+00 9.8489e+00 4.6478e+00 1.2068e+01 4.8349e+00 -#> -5.8179e+00 -8.1493e+00 1.6114e+00 -4.0638e+00 6.5257e+00 -5.3299e+00 -#> -7.2035e-01 -4.5514e+00 -1.9175e+00 1.3576e+00 9.6702e+00 -6.4111e+00 
-#> 3.2116e+00 -2.5445e+00 -4.9105e+00 -8.8980e-01 1.1377e+01 -3.6236e+00 -#> 1.0368e+01 2.7730e+00 8.0303e+00 -6.5360e+00 -1.2322e+01 2.1166e+01 -#> -4.8024e+00 4.8266e+00 8.0571e-01 8.9774e-01 -8.2069e+00 -9.2629e+00 -#> 2.2174e-01 -6.4808e+00 -5.2895e+00 3.6308e-01 -1.2931e+00 2.7188e+00 -#> 9.5745e+00 1.2801e+00 7.4423e+00 1.8955e+00 -4.1224e+00 -4.7624e+00 -#> -5.2236e+00 -1.0175e+00 -5.3479e+00 1.6259e-02 1.3361e+00 7.1376e+00 -#> -7.6194e+00 6.1334e+00 1.1134e+00 -9.8276e+00 7.6407e+00 -1.5565e+01 -#> -1.8772e+00 -1.9989e+01 -1.6736e+01 2.7651e+00 -3.2357e+00 6.1813e+00 -#> 8.6555e+00 1.2206e+01 7.3953e+00 -1.3803e+01 -9.0671e+00 1.8454e+01 -#> 6.6007e+00 -5.4982e+00 -7.9111e+00 -1.0212e+01 -1.3995e+01 -2.5820e+00 -#> 3.3626e+00 -2.6789e+00 7.5463e+00 8.0132e+00 7.1050e+00 -4.7084e+00 -#> 3.1330e+00 1.9449e+00 -7.4689e+00 1.5632e+00 -5.9301e+00 6.4092e+00 -#> 1.0197e+01 -7.0770e+00 -4.1250e+00 -8.2758e+00 1.3048e+01 -7.7559e+00 -#> -2.1499e+00 -6.4168e+00 -2.3384e+00 -1.8222e+00 1.1717e-01 3.8442e+00 -#> -6.0073e+00 -8.4099e+00 3.9294e+00 -6.3322e+00 -6.4771e+00 2.1426e+01 -#> 1.1076e+01 5.8835e+00 -4.9446e+00 -4.7285e+00 2.9562e+00 1.0060e+01 -#> 3.8793e+00 -1.3364e+01 -5.9460e+00 -5.2039e-01 4.9388e+00 9.8313e+00 -#> -7.5788e+00 -4.2444e+00 2.7747e+00 2.6616e+00 4.5422e+00 3.9646e-01 -#> -#> Columns 13 to 18 -6.5245e+00 2.3812e+00 -2.7277e+00 -2.9581e+00 -1.6089e+01 -1.4964e+01 -#> -5.1098e-01 -1.6415e+00 -1.0149e+01 5.5287e+00 4.1055e+00 1.2482e+00 -#> -6.7670e-01 -1.7101e+00 -4.9976e+00 1.0970e+01 -3.4336e+00 -5.6223e+00 -#> -4.6039e+00 -1.9907e+00 -2.7878e+00 -1.1976e-01 5.3809e-01 2.7278e-01 -#> -4.1515e+00 6.1336e+00 -2.5220e+00 6.6164e+00 9.2006e+00 3.1525e+00 -#> -3.0194e+00 -2.8319e+00 -2.1512e+01 2.1852e+00 9.3475e+00 -8.1556e-02 -#> -2.9080e+01 2.7905e+00 7.8276e-01 2.3943e+00 1.5815e+00 -3.1924e+00 -#> -4.5630e+00 4.8802e+00 -6.8771e+00 -8.7013e-01 -1.4001e+00 1.2690e-01 -#> 5.6586e+00 8.6122e+00 1.9926e+00 -2.5550e-01 1.9946e+00 3.0214e+00 -#> 5.9062e+00 -3.4057e+00 3.9138e+00 -3.8029e-01 -7.9910e+00 3.1446e+00 -#> 1.0425e+01 -7.9493e+00 3.6418e-01 1.0828e+01 -8.4890e+00 2.3251e+00 -#> 2.3559e+00 3.4553e+00 3.1563e-01 1.0489e+01 -6.5857e+00 -1.4040e+01 -#> -7.9710e-01 4.0536e+00 -3.0672e+00 1.1442e+00 3.5079e+00 1.6066e+00 -#> -7.5029e+00 1.2243e+00 -3.7898e-01 -1.5373e+00 -1.1446e+00 3.8205e-01 -#> 4.1850e+00 2.2051e+00 -2.0903e-01 2.1436e+00 1.8623e+00 -3.8784e+00 -#> -4.1479e+00 -7.1423e+00 -3.1687e+00 -3.1377e+00 5.9941e+00 5.1574e+00 -#> 8.0884e+00 5.1589e-01 4.0641e+00 -1.9410e+00 -1.3332e+01 -7.8220e+00 -#> 8.4895e-01 8.6939e+00 -8.5540e+00 7.4821e+00 4.9653e+00 -1.1706e+01 -#> -1.3710e+01 -9.1676e+00 -2.5768e+00 9.1837e+00 -1.1243e+01 2.3341e-01 -#> -4.1771e+00 -2.4854e+00 2.1148e+00 -4.6356e+00 -2.6337e+00 9.0549e-01 -#> 5.2531e+00 -3.7544e+00 -5.2746e+00 4.3894e-01 6.3111e+00 -1.2990e+00 -#> -3.5439e+00 -6.1807e-01 9.6415e+00 -8.0851e+00 4.2410e+00 7.7130e+00 -#> -6.7433e+00 -1.0340e+01 -8.5594e+00 2.3174e+00 -2.7162e+00 -8.6086e-01 -#> -2.7265e+00 -7.5501e+00 -3.1762e+00 1.0278e+01 -8.0296e+00 3.2575e-01 -#> -4.1769e-01 -2.5689e+00 -9.3748e+00 2.6912e-01 -4.4217e+00 2.3118e-01 -#> -4.7241e+00 3.1261e+00 -3.9767e+00 -3.7091e+00 -2.0489e+00 2.0303e+00 -#> 5.1013e+00 1.4020e+01 -4.9716e+00 -9.3442e+00 1.9353e+00 -4.5956e+00 -#> -4.8233e+00 -8.7770e+00 1.9722e+01 9.5561e-01 3.9089e+00 1.3546e+01 -#> 9.2608e-01 3.2052e+00 -1.4554e-01 -1.4802e+01 -5.1632e+00 6.9924e+00 -#> -5.7905e+00 -5.4141e+00 4.2075e+00 6.7678e+00 -1.7884e+01 -3.1346e+00 -#> 1.0392e+01 
2.7993e+00 -2.3305e+00 7.8083e+00 -1.1962e+01 -1.3522e+01 -#> 5.6182e+00 -2.6997e+00 3.7665e+00 1.6071e+00 -3.7697e-01 1.3203e+00 -#> -4.3505e+00 -1.3015e+00 -1.3011e+01 -7.3237e+00 4.3552e+00 6.8349e+00 -#> -#> Columns 19 to 24 -4.3832e+00 1.1854e+00 3.7849e+00 -8.7515e-01 -7.2142e+00 2.8712e+00 -#> 4.3580e+00 1.0377e+01 7.7775e+00 5.6452e+00 -5.5160e+00 -4.8094e+00 -#> 1.2200e+00 2.5590e+00 7.4564e-01 8.7451e+00 8.5624e+00 4.9214e-01 -#> -4.0737e+00 1.1564e+01 3.7737e+00 5.8098e+00 -9.9780e+00 6.1329e+00 -#> 8.3322e+00 -8.3092e+00 -6.6719e+00 -1.1653e+00 9.0612e+00 -6.9816e-01 -#> 4.1872e+00 5.6975e+00 3.3598e+00 1.7068e+00 -1.0912e+01 1.0734e-01 -#> 6.7030e+00 -6.7618e+00 1.2650e+00 -1.8857e+00 1.7936e+00 1.0692e+01 -#> 1.0883e+00 4.4177e+00 -2.1538e+00 1.1302e+00 -3.1243e+00 1.4657e+00 -#> 2.5484e+00 2.1354e+00 -5.3882e+00 2.5681e+00 9.9810e+00 -5.4780e+00 -#> 7.7740e+00 1.0476e+00 -5.1584e+00 8.7806e-01 9.8774e-01 -1.1787e+00 -#> 2.5723e+00 -1.0043e+01 -2.5171e+00 1.2171e+00 6.1552e+00 1.3662e+00 -#> 4.5510e+00 -1.4972e+00 1.1797e+00 9.6993e+00 1.1976e+00 -4.3668e+00 -#> 2.2118e+00 -2.7551e+00 -8.4045e-01 3.4012e-01 9.2944e+00 3.2099e+00 -#> -9.9009e+00 1.7620e+00 -3.0801e+00 7.2096e+00 8.1377e+00 8.4507e+00 -#> 1.2373e+00 -2.8196e+00 -2.6450e+00 -7.0677e+00 9.0234e+00 6.5156e+00 -#> -4.0219e+00 8.0798e-01 -3.0108e+00 -1.4071e+00 6.3210e+00 6.3155e+00 -#> 4.1826e+00 1.0785e+01 -7.9641e-02 -8.0781e+00 -2.8322e+00 5.8314e+00 -#> -1.6334e+00 -8.2829e-01 -5.2327e-01 9.1560e-01 2.7875e-01 1.0949e+00 -#> 1.1560e+01 7.1775e+00 2.4676e+00 4.3689e+00 1.2348e+00 1.5355e+00 -#> -7.2999e+00 -8.7090e+00 -1.0109e+00 8.0552e+00 5.2843e+00 -5.5434e+00 -#> -5.6621e+00 2.0950e+00 5.6926e+00 8.5302e+00 -1.8420e+00 4.9828e+00 -#> -6.2905e+00 -3.2019e+00 3.9112e+00 -2.8016e+00 -2.4425e+00 -3.9356e+00 -#> -1.2319e+00 9.3284e+00 7.1342e+00 -2.9960e-01 3.5464e+00 9.6238e+00 -#> -2.5003e+00 3.7505e-01 -1.8017e+00 -2.4822e+00 -1.3950e+01 4.2405e+00 -#> 6.2448e-01 5.0793e+00 -2.1631e+00 4.3120e+00 -6.7088e+00 -6.2485e+00 -#> 1.0608e+00 8.4343e-01 1.4154e+00 -8.8833e-01 -4.9350e-02 5.2178e+00 -#> 5.0803e-01 -2.5242e-02 5.2353e+00 -4.7372e+00 1.6922e+00 2.6130e+00 -#> -8.1313e-01 -1.4678e+01 -8.0687e+00 -7.8823e+00 5.9716e+00 9.7268e+00 -#> -4.1983e+00 1.7083e+00 -4.8165e+00 1.0524e+01 2.4571e+00 -5.8272e-01 -#> 9.0055e+00 1.6781e+00 -3.9620e+00 -1.9208e-01 7.7751e+00 4.8966e+00 -#> 5.6446e+00 1.0289e+01 -2.6237e+00 -9.7269e+00 -1.8522e-01 -1.3218e+00 -#> -3.3620e+00 -1.8233e+00 -3.1495e-01 1.1295e+00 5.1915e+00 -6.8499e+00 -#> -1.2493e+00 -3.7749e+00 -4.4679e+00 4.6988e+00 6.2839e+00 1.0712e+00 -#> -#> Columns 25 to 30 -3.7678e+00 -5.7242e+00 -6.5666e+00 -6.8208e+00 -5.8472e+00 -4.0306e-01 -#> -3.7068e+00 -1.8190e+00 -3.1648e+00 6.9638e+00 5.9712e+00 4.9328e+00 -#> -1.0388e+01 -5.5448e+00 -1.4448e+00 2.6815e+00 3.7214e+00 -3.2213e-01 -#> -3.6283e+00 -2.1454e+00 3.2638e-01 6.5090e+00 3.4894e+00 5.7031e+00 -#> -5.3661e-02 4.1527e+00 2.4702e+00 -1.5813e-01 -1.6505e+00 1.4394e+00 -#> 1.6428e+01 -9.1191e+00 1.1358e+00 -3.5391e+00 8.9095e+00 -2.4697e+00 -#> -4.4852e+00 1.1873e-01 -3.0818e+00 2.8583e+00 2.7108e+00 5.9310e-01 -#> 3.1718e-01 5.4803e+00 -7.4583e+00 -3.3241e+00 1.9538e+00 -9.9190e+00 -#> -5.7366e+00 9.0325e+00 -3.6218e-01 1.2094e+01 -5.1567e+00 2.4442e+00 -#> -1.0432e+01 4.0087e+00 -5.2226e+00 -3.0613e+00 -2.6841e+00 4.3142e+00 -#> 1.5683e+01 -4.8294e+00 -3.7603e+00 -3.9085e+00 6.4568e+00 -9.4047e-01 -#> -8.2675e+00 -1.2175e+01 -2.7009e-01 3.1945e+00 7.9773e+00 -9.3286e+00 -#> -1.1884e+00 6.6128e-01 
5.1019e+00 5.5271e+00 -1.3284e+01 -6.6113e+00 -#> -1.3139e+01 4.7815e+00 1.3458e+00 7.1244e+00 -8.1554e+00 5.0782e-01 -#> 4.6021e+00 5.5465e+00 1.0623e+01 -6.7340e+00 -1.0824e+01 9.9315e+00 -#> -1.2730e+01 1.4776e+01 -2.8510e+00 1.9954e+00 -6.1477e+00 6.8560e+00 -#> -8.4356e+00 3.8520e-01 -9.1840e+00 1.9059e+00 -8.1112e+00 -5.6377e+00 -#> -7.3018e+00 -2.0424e+00 2.1029e+00 -1.0376e+00 2.2140e+00 -6.5560e+00 -#> 5.9514e+00 2.8834e+00 8.2532e+00 -1.2182e-01 1.4292e+01 6.1602e+00 -#> 2.3185e-01 -5.6710e+00 -1.6720e+01 -6.4373e+00 -8.8326e-01 1.2590e+00 -#> -8.3497e+00 -1.0511e+01 -9.8981e+00 1.9591e+01 7.4912e+00 4.6404e+00 -#> -6.4806e+00 9.1930e+00 3.6603e+00 6.1261e+00 -4.8635e+00 -1.2633e+01 -#> 1.6575e+00 1.2219e+01 1.8400e+00 -5.0897e-01 4.8157e-01 -1.5647e+00 -#> 1.0697e+01 -1.0563e+01 -1.9335e+00 5.2464e+00 -4.9309e+00 5.6560e+00 -#> 2.7579e+00 -1.0974e+01 3.7751e+00 2.4941e+00 1.2820e+01 1.2087e+01 -#> 1.0323e+01 9.6038e-01 -8.0909e+00 -1.2246e+01 5.8800e+00 -8.1793e+00 -#> -3.2346e-01 5.5698e+00 -4.4115e+00 9.1758e-02 -5.0301e+00 3.8046e-02 -#> 5.3821e+00 -8.0975e+00 3.9839e+00 -1.6710e+00 1.3352e+01 -6.0166e+00 -#> -1.8801e+01 -7.5684e-01 -6.5811e+00 2.7727e+00 8.5961e+00 8.7461e+00 -#> 5.9716e+00 -2.9117e-01 6.6215e+00 -6.3897e+00 3.0783e+00 -7.0034e+00 -#> 3.3100e+00 5.9482e+00 7.1441e+00 -7.2146e+00 -1.1453e+01 -2.1975e-01 -#> -2.5174e+00 6.1783e-04 3.0374e+00 9.1811e-01 1.3462e+00 1.2635e+01 -#> -1.1931e+00 -5.5093e+00 -6.1567e+00 -2.6138e+00 2.6214e+00 1.4476e+00 -#> -#> Columns 31 to 36 7.4223e-01 7.0131e+00 1.1629e+01 -5.2268e+00 -1.1719e+01 3.1106e+00 -#> 2.6765e+00 7.4811e-01 2.4381e+00 -6.1843e+00 8.3653e+00 1.0713e+01 -#> 5.0892e+00 1.7012e+01 5.5125e+00 -1.1288e+01 4.5462e+00 5.6505e+00 -#> 6.4937e+00 5.3456e+00 7.8426e+00 -1.6293e+01 -2.3455e+00 -5.8268e+00 -#> -1.0484e+00 3.4260e+00 -8.2399e+00 4.6738e+00 -5.2327e+00 7.4706e+00 -#> 5.3485e+00 -1.8475e+00 -3.6222e+00 -1.6178e+01 -3.4572e+00 3.2129e+00 -#> 6.6275e+00 -2.6183e+00 1.6539e+00 4.8539e+00 -5.4083e+00 5.7612e+00 -#> -2.7518e+00 5.2740e+00 1.5759e+01 -9.9001e+00 2.3535e+00 7.7896e+00 -#> 3.7825e+00 -3.4907e-01 -3.4481e+00 -2.5707e+00 -9.7787e+00 1.5958e+01 -#> -3.8520e-01 1.4803e+01 3.4473e+00 -6.6064e+00 4.9638e+00 7.6164e+00 -#> -2.5500e+00 4.3077e+00 -1.0892e+01 -4.6185e+00 3.9399e+00 9.3895e+00 -#> -8.5954e+00 5.5902e+00 9.9969e+00 9.0837e+00 -7.0463e+00 1.6710e+00 -#> 2.7788e+00 7.4384e+00 8.5017e+00 -2.4923e+00 1.0762e+01 5.9373e+00 -#> 4.4875e+00 1.5368e+00 -1.6538e+00 -1.0821e+01 4.6451e+00 -1.2634e+00 -#> 1.2521e+00 1.4079e+01 2.2609e+00 -1.0919e+01 8.3839e+00 -3.6383e+00 -#> 1.1817e+01 2.9383e+00 -4.2285e+00 2.2827e+00 -1.1505e+01 3.6831e+00 -#> 5.6361e-01 1.7811e+00 -2.0413e-01 8.8456e-01 -2.2689e+00 -3.1501e+00 -#> -9.8936e+00 3.8421e-02 1.3489e+00 -7.7164e+00 1.4918e+00 1.5608e+00 -#> 3.9265e+00 1.5115e+01 -4.5694e+00 -7.6374e+00 3.5958e+00 2.6907e+00 -#> -2.7697e+00 -1.5410e-01 5.8248e+00 3.0651e+00 9.2409e+00 -4.3196e-01 -#> -3.9996e+00 3.8280e+00 7.0900e+00 -9.7899e+00 -1.5215e+00 1.3456e+01 -#> 1.0321e+01 -9.7283e+00 -4.8336e+00 1.3471e+01 4.9134e+00 3.0929e+00 -#> 8.3774e-01 4.2876e+00 -1.7407e+01 2.7746e+00 -6.0241e+00 -9.6043e+00 -#> 3.1691e+00 8.8499e+00 -9.8104e-01 5.2230e+00 -8.6498e-01 1.2157e+01 -#> 9.0028e-01 -7.4834e+00 -3.0958e+00 -4.7875e+00 -1.0619e+01 1.0254e+00 -#> 5.0236e+00 1.2426e+01 1.5218e+01 -4.1620e+00 -5.4289e+00 7.4496e+00 -#> 1.5874e+01 -1.8470e+00 -5.5536e+00 1.0463e+01 -1.4672e+01 -4.7601e+00 -#> -9.4839e-01 1.9456e-02 -2.4531e+01 -1.4408e+00 4.2728e+00 -7.0230e+00 
-#> 9.5121e+00 1.6671e+00 8.0547e-01 -7.9168e+00 4.5085e+00 -9.3673e+00 -#> -1.4331e+00 -3.4387e+00 -8.9806e+00 3.4347e-01 -1.6345e+01 -8.6711e-01 -#> 3.4529e+00 7.9604e+00 7.3007e+00 -1.6455e+01 -9.1844e+00 -7.5894e+00 -#> 4.7911e+00 6.6002e+00 -3.2026e+00 -9.1167e-01 3.9497e+00 -1.9500e+01 -#> 1.6120e+01 -1.0085e+00 2.9139e+00 9.0391e+00 -4.1158e+00 1.2126e+01 -#> -#> Columns 37 to 42 -4.2708e+00 -6.2193e+00 4.0405e+00 -2.4626e+00 2.6114e+00 -6.6725e+00 -#> -2.6733e+00 5.4875e+00 -9.1574e-01 1.0587e+01 1.4299e+00 7.1512e+00 -#> -1.1510e+01 -1.1695e+01 -6.2071e+00 -8.2175e+00 1.9661e+00 1.0048e+01 -#> -2.6016e+00 -7.4609e+00 -5.6747e+00 9.3854e+00 9.3149e+00 5.0030e+00 -#> -1.1570e+00 8.6847e-01 4.5684e+00 1.1916e+01 4.5815e+00 3.9475e+00 -#> 4.1254e+00 5.6743e+00 -2.8369e+00 8.3233e+00 -6.2624e+00 -6.4872e+00 -#> 6.2880e+00 -1.0629e+01 4.8851e+00 4.4807e+00 7.8539e+00 2.9114e+00 -#> 5.6523e+00 -9.5010e+00 1.1585e+01 -7.8069e+00 -1.9712e+00 -1.4402e+00 -#> -9.8762e+00 1.1291e+01 3.9450e+00 7.5963e+00 1.1955e+00 1.0262e+01 -#> 6.1205e-01 -3.9418e-01 6.0771e-01 1.7916e+00 4.0193e+00 1.2516e+01 -#> -5.3330e+00 5.4140e+00 1.0709e+01 5.9039e+00 -2.1101e+00 5.9015e+00 -#> 4.4010e+00 -5.8462e+00 2.4776e+00 -2.7191e-01 2.4018e+00 -5.4125e+00 -#> -1.6926e+00 3.5603e+00 6.8201e-01 -1.0361e+01 -1.1898e+00 1.4208e+01 -#> -8.8385e+00 9.4390e-01 -1.0417e-01 7.5702e-01 5.1242e+00 6.3443e+00 -#> -1.0421e+01 3.4307e+00 1.0905e+00 -3.4570e-01 -4.2475e+00 3.7947e+00 -#> -8.2029e+00 -7.5056e-02 -1.0317e+01 9.6688e-01 5.8398e+00 6.9069e+00 -#> 9.2056e+00 2.6418e+00 -6.2371e+00 3.8774e-01 1.1345e+00 -3.7623e+00 -#> -4.0129e+00 -4.6145e+00 -4.4591e+00 -1.5093e-01 3.8242e+00 -2.6517e+00 -#> -7.3270e-01 3.5610e+00 9.5007e+00 5.1863e+00 5.7790e+00 -9.9534e+00 -#> -1.9771e+00 -4.4519e+00 2.3197e+00 -5.2061e+00 -4.5916e-01 -1.5046e-01 -#> -6.4935e+00 -5.2702e+00 4.9656e+00 -7.4239e+00 9.6984e-01 5.0983e+00 -#> -3.6458e+00 1.0499e+01 -3.4077e-02 2.9104e+00 -4.5600e+00 9.1534e+00 -#> 9.5156e+00 -1.5982e+00 4.5251e+00 3.1438e+00 4.4952e+00 -1.8068e+01 -#> -2.3347e+00 8.9090e+00 -3.1034e+00 -5.2686e+00 1.9010e+00 3.1109e+00 -#> -1.6672e+01 7.5201e+00 -1.2067e+00 3.2212e+00 -9.8372e-02 -1.1015e+01 -#> 1.0232e+01 -6.7055e+00 1.3532e+01 -2.2495e+00 -3.1926e+00 -3.1087e+00 -#> -5.6343e-01 -2.0494e+00 3.1293e+00 8.3727e+00 6.9515e+00 -7.7970e+00 -#> -3.8374e-01 4.6507e+00 -1.0472e+01 -1.6164e+00 9.6006e+00 6.4874e+00 -#> -6.7306e+00 -5.4101e+00 5.1182e+00 -7.9834e+00 9.7989e+00 -5.6655e+00 -#> 3.6640e+00 -1.7696e+00 -4.1505e+00 1.7499e+00 8.3987e-01 4.7474e-01 -#> -4.6932e+00 -3.6107e+00 6.9508e+00 8.5656e+00 6.0091e+00 -3.3539e+00 -#> 8.8946e+00 -6.8784e+00 -5.4160e-01 8.3567e+00 -8.4307e+00 1.9961e+00 -#> -3.1567e-01 6.1672e+00 -1.2873e+00 -3.1027e+00 -7.7037e+00 4.4967e+00 -#> -#> Columns 43 to 48 7.6586e+00 -1.1518e-01 -3.6124e+00 -6.9019e-01 5.0078e+00 -7.1079e+00 -#> -1.1350e+01 5.9555e+00 6.9061e+00 7.6141e+00 4.6341e-03 -9.8206e+00 -#> -8.3167e+00 -4.8917e+00 6.2786e-01 3.4504e+00 3.6465e+00 -8.0655e+00 -#> -9.8720e+00 -1.3697e+00 1.8162e+00 -1.2960e+00 5.4253e+00 -8.4163e+00 -#> -4.6093e+00 3.1744e-01 1.3845e+01 5.9917e+00 -6.8247e+00 1.5222e+00 -#> 2.9278e+00 7.6591e+00 8.8834e+00 -7.3206e+00 3.5923e+00 1.1554e+00 -#> 1.7499e+01 -7.5501e+00 1.0134e+01 -2.2161e+00 2.7105e-01 -1.2830e+01 -#> 4.5848e+00 -6.8604e+00 -7.4688e-01 2.7601e+00 1.7959e+00 -4.4080e+00 -#> -6.2159e+00 1.0061e+01 8.0681e+00 4.7329e+00 -1.4557e+01 -5.9783e+00 -#> -1.4388e+01 6.6257e+00 5.5512e+00 1.4502e+01 2.5892e-01 1.5548e+00 -#> -5.8538e+00 
1.7142e+00 3.0457e-01 5.8071e+00 -1.3670e+01 1.0826e+01 -#> -2.2469e+00 -8.5539e+00 2.7219e+00 -3.7882e+00 6.9489e-01 4.0771e-01 -#> -1.2400e+00 -6.6279e-01 -5.0088e+00 1.0212e+00 4.5081e+00 2.4033e+00 -#> 6.0568e-01 1.0213e+00 4.9064e+00 -1.2058e+01 -5.7127e+00 1.8151e+00 -#> 7.8649e-01 3.0735e+00 9.0810e+00 -1.4522e+00 -2.7879e+00 -1.1300e+01 -#> -6.7457e+00 -3.5366e+00 -8.0993e+00 2.4395e+00 3.8280e+00 2.8154e-01 -#> 3.8816e-01 4.0043e+00 -1.2529e-02 2.9227e+00 1.1307e+00 -2.1515e+00 -#> -1.3813e+00 -7.7966e+00 6.1247e-01 -6.6544e+00 6.6106e-01 6.3762e+00 -#> -2.4359e+00 -4.7479e+00 -1.0501e+01 3.6424e+00 -3.3815e+00 -7.2181e+00 -#> 3.5248e+00 4.5957e+00 1.1000e+01 2.9740e-01 -5.6136e+00 8.5737e+00 -#> 3.5406e+00 6.4048e+00 1.1231e+01 -8.7433e+00 -7.5803e+00 1.0159e+00 -#> 8.7339e+00 1.2499e+01 3.4785e-01 -1.0078e+01 -6.6816e+00 -5.8729e+00 -#> 6.4248e+00 -5.9144e-01 5.8935e+00 -6.0223e+00 -8.3323e+00 5.0163e-02 -#> 9.3207e+00 1.0922e+01 -2.6929e+00 -1.4812e-01 2.8792e+00 1.0079e+01 -#> -1.1273e-01 -3.4145e+00 7.7179e+00 -3.9226e+00 -3.9568e+00 -3.7776e+00 -#> 6.0739e+00 -1.0924e+01 6.6230e+00 8.0777e+00 4.8874e-01 -8.7335e+00 -#> 6.0373e+00 -2.6559e+00 6.3964e+00 -3.3931e+00 -4.7059e+00 -4.6125e+00 -#> 1.0763e+01 3.2660e+00 1.2023e+00 1.4326e+01 2.6127e+00 5.9575e+00 -#> -5.4855e+00 -7.0947e+00 9.2428e+00 6.1904e+00 -4.3986e+00 -5.3961e+00 -#> 6.7367e+00 2.6316e+00 -3.6548e+00 9.4662e+00 7.7264e+00 5.8823e+00 -#> -6.9419e-01 -7.8668e+00 8.8550e-01 1.0576e+01 6.2078e+00 -4.0643e+00 -#> -8.6446e+00 4.2446e+00 -2.4906e+00 5.2612e+00 9.2999e+00 7.0488e-01 -#> -5.8305e-01 -7.3184e+00 2.9134e+00 -4.2438e+00 1.0961e+00 -5.6040e+00 -#> -#> (5,.,.) = -#> Columns 1 to 6 -2.3830e+00 1.5045e+01 1.0755e+01 1.9093e+00 3.4693e+00 6.0668e+00 -#> -9.1152e-01 -3.0114e+00 -2.5333e+00 5.3111e+00 7.9649e-01 -1.6620e-01 -#> 3.2936e+00 -8.9781e+00 -3.2925e+00 4.2987e+00 8.5330e+00 -2.5871e+00 -#> -6.7483e+00 -1.5892e+00 -8.2601e-01 8.3264e+00 2.0445e+00 8.9829e-02 -#> 7.6095e-01 -3.3332e+00 2.8401e+00 -6.2223e+00 3.0014e+00 5.3401e+00 -#> 4.4991e+00 6.8539e+00 -7.9476e-01 -1.1602e+01 -1.2376e+01 5.7659e+00 -#> -3.9050e-01 -6.8403e+00 3.2067e+00 2.6319e+00 2.6960e+00 5.4775e+00 -#> 2.5490e+00 -2.2545e+00 1.9085e+00 1.9497e+00 -5.6462e+00 -5.9350e+00 -#> -7.9047e+00 2.6436e+00 2.0883e+00 -4.8206e+00 1.5255e-01 9.9853e-01 -#> -6.8098e+00 -4.4973e+00 -9.4591e-01 4.4095e+00 6.2232e+00 4.2034e+00 -#> 3.6400e+00 4.5510e+00 -1.4693e+00 -1.9791e+00 1.4600e+01 -1.1596e+01 -#> 2.0117e+00 -7.4495e+00 1.9576e+00 1.4078e+01 -1.7327e+00 -7.9045e+00 -#> -5.4675e+00 -8.1726e+00 -2.1059e+00 6.7968e+00 -4.1634e+00 7.1000e-01 -#> 3.3354e-01 -2.4415e-02 1.1163e+00 5.2675e+00 4.8768e+00 -1.1194e+01 -#> 5.9835e+00 8.5104e+00 4.7339e+00 1.0464e+01 4.9154e+00 2.1113e+00 -#> -1.3915e+01 -1.9716e+00 -2.9983e+00 -1.9405e+00 2.1733e+00 -9.8436e-02 -#> 7.1030e+00 -2.7523e+00 3.8541e+00 -6.1061e+00 6.9082e-01 4.5209e+00 -#> 9.2541e+00 -1.7739e+00 -4.2750e-01 -9.3434e+00 -6.3315e-01 1.9368e+01 -#> -3.5566e+00 -1.6541e+00 4.1912e+00 4.9319e+00 9.8950e+00 -1.2191e+01 -#> 1.1695e+01 -3.1335e+00 -3.6334e-01 5.4959e+00 1.1974e+01 -3.9265e+00 -#> 5.5305e+00 -2.7223e-01 -8.4738e+00 3.1758e+00 -1.2866e+00 -6.1654e+00 -#> -5.3902e-01 -3.0807e+00 -9.3326e-01 7.4006e+00 -3.2220e+00 7.1673e+00 -#> -5.6331e-01 4.7113e-01 1.0444e+00 -5.5636e+00 7.4316e-01 -1.0495e+01 -#> -5.2943e+00 -3.6060e+00 -2.2940e+00 -2.6019e+00 1.6932e+01 -9.5823e+00 -#> 1.2445e+01 1.6621e+00 1.4975e+00 -4.6527e+00 3.0234e+00 -3.0627e-01 -#> 3.7138e+00 -7.4142e-01 6.6150e+00 
-2.3159e-01 2.8262e+00 1.0662e+00 -#> 7.6823e+00 8.5604e+00 9.3044e+00 5.1059e+00 -6.4475e+00 -5.8448e+00 -#> -3.1239e+00 -6.4697e+00 2.3946e+00 3.6733e-01 -7.1365e+00 -9.1770e+00 -#> 1.0079e+01 1.0610e+00 7.7564e+00 9.0786e+00 3.1420e+00 -1.1080e+01 -#> 3.1758e-02 -6.1169e+00 -2.4867e+00 -7.9838e+00 1.3619e+00 -9.4099e+00 -#> 2.5039e+00 9.3481e+00 6.4029e+00 3.7382e+00 -3.6803e+00 -1.0931e+01 -#> -7.1225e+00 -5.6279e-03 5.9118e+00 1.0263e+01 2.1115e+00 -7.7262e+00 -#> 2.5962e+00 -4.0675e+00 1.1823e-01 4.6199e+00 -8.2061e+00 7.5710e+00 -#> -#> Columns 7 to 12 1.8408e+01 1.5652e+00 -1.9543e+00 -2.8085e+00 -6.8231e-01 4.1349e+00 -#> -6.5164e+00 1.4820e+01 -1.1288e+00 4.5189e+00 8.7818e+00 -4.2647e+00 -#> -7.7523e+00 7.2089e+00 1.1656e+00 -2.2606e+00 2.2263e+00 -8.8486e+00 -#> -6.4213e+00 -7.8116e-01 -1.3705e+01 -1.4288e+00 7.0685e-01 -9.6210e+00 -#> -5.2904e+00 -4.3724e+00 -4.5616e+00 -2.1462e+00 -1.6026e+00 2.4151e+00 -#> 7.6076e+00 1.4017e-01 7.1753e+00 1.6275e+00 -1.8538e+01 -6.3133e+00 -#> -3.0048e-01 1.1206e+00 -2.7181e+00 -8.8819e+00 1.1768e+01 -2.3927e-01 -#> 3.9040e+00 1.5906e+00 -4.6165e+00 5.6143e+00 -2.3985e+00 3.8737e+00 -#> -1.8581e+00 4.4859e+00 -8.5661e-01 9.1416e+00 -7.1321e+00 1.5359e+00 -#> 2.5870e+00 6.4013e+00 -3.5112e+00 3.0193e+00 -9.9813e-01 4.9914e-01 -#> -5.6749e-01 -2.5634e+00 -2.0049e+01 7.6160e+00 -5.6121e+00 1.2002e+01 -#> -5.1016e+00 4.5954e+00 -6.6194e+00 1.0558e+01 8.4648e+00 4.0876e+00 -#> 1.9074e+00 -4.4138e+00 1.6856e+01 -2.3015e+00 9.8041e+00 5.1201e+00 -#> 7.9012e+00 -7.8831e+00 6.1398e+00 3.3781e+00 -1.0034e+01 1.1867e+00 -#> 3.8663e+00 -8.9898e+00 8.8156e+00 -1.6206e+01 1.1986e-01 -2.6682e-02 -#> -3.2041e+00 -1.4711e+01 -1.0443e+00 5.8869e-01 -9.3162e+00 -3.5679e+00 -#> 8.5277e-01 -6.4199e-01 -6.1192e+00 -1.5057e+01 7.4858e+00 1.3985e+00 -#> 2.5524e+00 1.4245e+01 -5.8290e+00 -2.5738e+00 -9.5414e+00 -2.3828e+00 -#> -1.0419e+01 -1.1870e+01 -3.9730e+00 1.5484e+00 6.0266e+00 8.5218e+00 -#> -3.8284e+00 1.5833e+00 -6.3376e-01 -1.2201e+01 5.7151e+00 1.7104e+00 -#> -7.7268e+00 4.3423e+00 -1.8297e+00 1.2690e+01 6.7800e-02 8.8895e+00 -#> 2.4643e+00 2.1095e+00 2.8859e+00 4.3288e+00 6.2184e-01 4.8834e+00 -#> 1.5286e+00 -5.3391e-01 6.7303e+00 2.7903e+00 -2.8789e-01 -4.8311e+00 -#> 5.6804e+00 -1.5724e+00 4.5631e+00 1.4864e+00 1.0664e+01 7.5194e+00 -#> -5.1199e+00 -6.2939e+00 -1.4950e+00 -2.8765e+00 -3.7088e-01 -3.6455e+00 -#> 9.5212e+00 -1.0249e+01 4.2535e+00 -1.1122e+00 1.0850e+01 5.5438e+00 -#> -9.8939e+00 -7.7922e+00 -6.7318e+00 -3.9111e+00 -5.4714e-02 -1.3669e+00 -#> 4.4817e-01 -1.5928e+01 7.6097e-01 -9.1389e+00 8.9798e+00 -6.3461e+00 -#> 1.1088e+00 -5.1600e+00 1.5623e+00 4.1705e+00 -3.6206e+00 -2.3324e+00 -#> 2.9760e+00 -4.8346e-01 2.1871e+00 -7.7891e+00 -8.0293e+00 -6.5323e+00 -#> -1.4006e+01 -1.1111e+01 -9.3578e+00 -1.0229e+01 -9.0496e+00 -5.3950e+00 -#> -6.5811e+00 -4.4025e+00 7.0217e+00 -1.9153e+00 1.4510e+01 -6.2039e+00 -#> -8.7036e+00 -6.6167e+00 7.2552e+00 8.6312e+00 3.9875e+00 7.1604e-01 -#> -#> Columns 13 to 18 -4.0327e+00 1.2063e-01 3.8963e-01 1.2082e+01 4.4529e+00 -4.0982e+00 -#> 7.4974e+00 1.8638e+00 -8.1714e+00 -2.7034e+00 5.9391e-01 -1.8888e+00 -#> -1.9006e+00 -4.8901e+00 -5.2321e+00 -1.3496e+01 1.1133e+00 -1.3292e+00 -#> -1.7714e+00 -6.2307e-01 -1.4108e+01 7.9459e+00 4.0040e+00 4.6520e+00 -#> -5.9653e+00 -6.6982e+00 2.3026e+00 -4.1065e+00 -9.7532e-01 -4.1826e+00 -#> 9.5455e+00 -1.2973e+01 9.3023e+00 8.9305e+00 -2.2556e+00 -8.2025e+00 -#> 6.3662e+00 2.3125e+00 6.1421e+00 -2.8569e+00 -6.1378e+00 3.1209e+00 -#> 1.7458e+01 1.5194e+00 1.1149e+00 
-1.0232e+00 -5.4569e+00 4.5391e+00 -#> -8.9644e+00 5.6622e+00 -9.7496e+00 6.8962e+00 2.9958e+00 -5.5083e+00 -#> -3.4634e+00 -4.3375e+00 -3.3078e+00 -1.6976e+00 -1.8537e+00 -8.7774e-01 -#> 6.9377e+00 3.8605e+00 6.0268e+00 2.3274e+00 -8.8593e+00 -2.4879e+00 -#> 6.8199e+00 -5.9747e+00 -7.2865e+00 -1.4146e+00 7.5257e+00 -2.5840e+00 -#> 5.0461e+00 -6.7386e-01 -8.0948e+00 -6.2207e+00 -4.3244e+00 -6.1133e+00 -#> -2.8252e+00 -2.5870e+00 1.8028e+00 -3.6711e-01 2.0952e+00 3.6568e+00 -#> 2.3060e+00 -6.2086e+00 3.9923e+00 -1.4150e+00 -8.5676e-01 5.8699e+00 -#> -1.3242e+01 6.3704e+00 -1.0038e+01 2.3710e+00 -1.4868e+01 6.5075e+00 -#> -2.8410e+00 -6.0744e+00 9.4370e+00 1.7102e+00 5.2457e+00 -7.8648e+00 -#> 8.1169e+00 -9.1841e+00 8.2035e-01 4.0471e+00 -2.7658e-01 -3.2713e+00 -#> 7.9194e+00 -5.5048e-02 -3.5546e+00 2.4541e+00 -9.2249e+00 -5.8647e+00 -#> -1.7885e+00 5.6227e+00 9.0222e+00 -1.3317e+01 -3.9801e-01 1.6702e+01 -#> -3.9880e+00 8.6878e+00 -2.2664e-02 -7.4790e+00 -1.3203e+00 1.8476e+01 -#> -4.4812e+00 5.0602e+00 -3.5708e+00 1.2414e+00 -2.9848e+00 1.2678e+01 -#> -4.5570e+00 -2.6359e+00 1.4333e+01 9.2020e-01 4.0976e+00 4.4490e+00 -#> -1.1185e+01 3.1190e+00 -5.5076e+00 4.0855e+00 6.6168e+00 1.6649e+00 -#> -6.8375e-01 -2.9971e+00 3.8258e+00 5.9877e+00 6.4271e+00 -5.3766e+00 -#> 1.2921e+01 -1.8491e+00 1.1426e+00 -2.6788e+00 -1.8496e+01 7.4915e+00 -#> -7.3414e+00 3.5322e+00 2.5229e+00 5.7388e+00 1.1160e+01 -3.6650e+00 -#> -3.9620e+00 -4.8352e-02 7.4024e+00 -7.2069e-01 -9.0369e+00 1.8158e+01 -#> 1.4811e+00 -2.8406e+00 2.1601e+00 -6.3998e+00 8.4908e+00 7.1127e+00 -#> -1.0033e+01 -8.8568e-01 4.8732e+00 1.6056e+00 6.6930e+00 -8.4509e+00 -#> -3.1729e+00 -1.4420e+01 2.9933e+00 9.6449e+00 6.0507e+00 3.3586e+00 -#> -5.9448e+00 -4.6698e+00 2.9528e+00 -4.8673e+00 3.8821e+00 2.8102e+00 -#> 2.5439e+00 5.1470e-01 -2.2484e+00 -7.2786e+00 -2.3403e+00 -1.0638e+00 -#> -#> Columns 19 to 24 9.0982e+00 5.1736e+00 -2.4757e+00 1.5844e-01 -6.7332e-01 5.6665e+00 -#> -4.8000e+00 -8.6078e-01 9.5131e-01 -2.8858e+00 1.2703e+00 4.7499e-01 -#> -7.5245e+00 3.5997e+00 5.3406e+00 -3.8828e+00 5.1932e-01 7.5683e-02 -#> -8.4505e+00 1.1701e+01 -2.4831e+00 -1.4745e+01 5.4654e+00 -1.7338e+01 -#> 1.0567e+01 6.2607e+00 -3.6660e+00 -8.7202e+00 -1.5690e+01 -2.2930e+00 -#> 9.8490e+00 8.0237e+00 -3.6052e+00 3.3250e+00 -7.4190e+00 4.9683e+00 -#> 7.4106e+00 -6.2680e+00 5.1907e-02 -1.2005e+01 -6.8563e+00 4.7717e+00 -#> -9.6498e+00 -3.7699e+00 1.3658e+00 4.0453e+00 6.3397e+00 2.1112e+00 -#> 3.6101e-01 2.1361e+00 -5.9951e-01 -7.0478e+00 3.9619e+00 -1.2897e+01 -#> 1.7499e+01 1.4941e+00 -1.6669e+00 -1.9513e+00 -1.1608e+01 6.2823e+00 -#> 5.6085e+00 3.7327e+00 -1.0657e+01 6.8685e+00 -2.5067e+00 -2.8697e+00 -#> 4.8413e+00 1.7147e+00 3.9855e+00 -2.3646e+00 -1.2719e+00 5.9345e-01 -#> -1.1901e+01 2.1154e-01 1.2861e+01 6.5933e+00 5.5342e+00 -1.1736e+01 -#> -4.1619e+00 -4.0502e+00 4.5874e+00 -5.9631e+00 6.2003e+00 -1.5919e+01 -#> 3.2160e+00 6.1556e+00 5.7512e+00 -2.4275e+00 1.1169e+00 -7.0630e+00 -#> 3.8809e+00 -5.3835e+00 5.9786e+00 -1.2241e+01 -1.6439e+00 -1.5627e+01 -#> 1.7291e+01 2.2233e+00 -5.2146e+00 6.4464e-01 -5.5866e+00 1.7714e+01 -#> 5.3964e+00 1.3962e+01 1.9331e+00 -1.3314e+00 -5.3025e+00 -2.3913e-01 -#> -7.8175e+00 -7.1026e+00 -6.0704e+00 -8.0185e-02 -2.7631e-01 -3.9497e+00 -#> 2.5051e+00 -6.0993e+00 -5.4260e+00 -5.7393e+00 -3.3067e+00 9.7217e+00 -#> -1.2314e+01 -1.6606e+01 1.5940e+00 -6.1850e+00 1.7046e+01 -4.5048e-01 -#> -8.8948e+00 -5.6061e+00 5.1760e-01 -4.5070e+00 2.5624e+00 -1.1760e+01 -#> -8.3118e+00 -4.8301e-01 2.4386e+00 1.1577e+00 -5.4625e+00 
9.3602e+00 -#> 1.0364e+01 -1.7660e+00 -7.1697e+00 4.0364e+00 6.2778e+00 8.9278e+00 -#> 9.0088e+00 3.4713e+00 -1.4689e+01 -3.7752e+00 -2.6339e-01 -5.4922e+00 -#> -6.4503e+00 7.0307e+00 -2.5707e+00 -4.4928e+00 -3.8815e+00 2.7975e+00 -#> -4.7806e+00 -1.3645e+00 -2.0122e+00 -2.8309e+00 3.8261e-01 9.4296e+00 -#> 1.8947e+00 -7.1179e+00 2.2631e-02 3.1296e+00 -5.7349e-01 6.5689e+00 -#> 2.1592e+00 -2.2114e+00 1.0351e+00 -1.8950e+00 -2.5478e+00 -7.2731e+00 -#> 9.4142e+00 3.9147e+00 -3.7307e+00 6.2922e-01 -2.9651e+00 7.1303e+00 -#> -2.8823e-01 7.4542e+00 -4.3850e-01 -7.6315e+00 -3.2843e+00 -6.0237e+00 -#> 9.5521e+00 -1.9974e+00 2.2915e+00 2.4075e+00 -7.1246e+00 3.7398e+00 -#> -4.8636e+00 1.7748e+00 9.8526e-01 -3.3251e+00 3.6598e+00 -1.3092e+01 -#> -#> Columns 25 to 30 3.1280e+00 1.2788e+00 -5.9366e+00 -4.2194e+00 -1.0509e+01 1.7713e+00 -#> -5.3974e+00 -4.3479e-01 4.8893e+00 4.0898e+00 5.8615e+00 -5.5451e+00 -#> -7.2648e+00 1.4969e+00 -6.0060e+00 -3.7986e+00 4.8079e+00 6.1958e+00 -#> 7.6044e+00 -5.9375e+00 -1.1993e+01 3.3922e+00 5.0549e+00 9.1746e+00 -#> 5.0515e+00 -5.0973e+00 4.8219e+00 1.0619e+00 4.5165e+00 -8.5143e+00 -#> 2.8204e+00 -3.9292e+00 9.8225e-01 -6.2513e+00 3.4724e+00 5.7119e+00 -#> -5.8743e+00 -4.5231e-01 -8.1410e+00 -4.4791e+00 1.2651e+01 1.2361e+00 -#> -7.2702e+00 -2.2595e+00 6.5974e+00 3.6935e+00 -1.9022e+00 7.6575e+00 -#> 9.8234e+00 2.3628e-01 2.1644e+00 1.7129e+00 -6.3852e+00 -1.3977e-03 -#> 7.5595e+00 -2.3019e+00 9.2416e+00 2.9271e+00 4.4853e-01 -4.5047e+00 -#> -2.7705e+00 -7.2757e+00 -2.2790e+00 -2.7667e+00 -2.1357e+00 -3.7666e+00 -#> -1.5240e+00 -2.3198e+00 -1.0283e+01 -3.7943e+00 -9.4341e-01 6.0867e+00 -#> 2.9003e-01 1.1639e+01 5.3857e+00 -3.1820e+00 -1.7843e+00 1.6590e+00 -#> 1.1793e+01 -5.1962e-01 -1.2450e+01 -7.9430e+00 -1.1940e+00 2.2993e-01 -#> 3.3375e+00 1.0644e+01 5.7075e+00 -6.2523e+00 1.4935e+00 -8.8530e+00 -#> 7.4632e+00 -1.0253e+01 -5.3447e+00 2.2636e+00 -3.4865e+00 -2.5092e+00 -#> 6.9494e+00 6.0295e+00 2.3967e+00 -4.1855e+00 -8.0001e-01 -6.9067e+00 -#> 6.3739e+00 3.9918e-02 -5.2436e+00 -6.9196e+00 4.8604e+00 1.7641e+00 -#> -8.2620e+00 -6.7295e+00 -1.7506e+00 4.9274e-01 -5.8497e+00 4.8315e+00 -#> -8.1358e+00 -8.7697e+00 -1.4823e+00 -1.1388e+01 -4.1414e+00 -4.4856e-01 -#> -2.1427e+00 -8.0484e+00 1.3658e+00 -4.8648e+00 -1.7214e+00 2.5082e+00 -#> 4.4764e-01 -1.6163e-01 -2.3081e+00 -3.4154e+00 -3.8190e+00 -2.5088e+00 -#> 6.5203e-01 -6.3120e+00 -2.4546e+00 -4.3488e+00 -4.5031e+00 -4.1094e+00 -#> 8.2619e+00 2.1131e-01 7.1995e+00 3.4129e+00 -4.1694e+00 6.2864e+00 -#> -1.2033e+01 -1.0293e+01 -8.4255e+00 -1.1377e+00 -7.0015e+00 1.5763e+00 -#> 1.9668e+00 8.1584e-01 1.2777e+01 3.1784e+00 -6.6701e+00 -3.4124e-01 -#> -3.6915e+00 -5.8549e+00 -4.4516e+00 4.3056e+00 4.6994e+00 -2.7244e+00 -#> -1.6473e+00 1.9608e+00 -5.7647e+00 -7.5848e+00 4.9922e+00 -4.6011e+00 -#> -8.8363e+00 3.1260e+00 -2.4390e+00 1.9239e+00 -1.3892e+00 3.3748e-01 -#> -1.2075e+00 7.8508e+00 -3.4638e+00 -5.1220e+00 -1.4918e+01 -1.8710e+00 -#> 1.6721e+00 -1.1214e+00 -1.0962e+00 2.9899e+00 -4.2842e+00 -1.8938e-01 -#> -2.8022e+00 4.7344e+00 4.4270e+00 6.5299e+00 1.0474e+01 -4.4890e+00 -#> -5.0150e+00 2.8116e+00 -1.5807e+00 -4.7746e-01 -7.7342e+00 1.1266e+00 -#> -#> Columns 31 to 36 3.5927e-01 -5.8005e-04 -1.1270e+01 9.0515e+00 -8.1627e+00 6.4540e+00 -#> 3.4068e+00 -1.2006e+01 5.3405e+00 8.2016e-01 6.0324e+00 7.1262e+00 -#> 3.7822e+00 -2.6076e+00 2.8620e+00 -5.2815e+00 -2.5095e+00 -1.0144e+00 -#> -8.2290e-02 -1.3884e+01 1.2595e+00 -3.4814e-02 -9.7531e+00 5.2547e-01 -#> -1.4943e+01 5.3147e+00 5.4917e+00 -1.0074e+00 
9.0445e-01 -1.3210e+01 -#> -9.3631e+00 -3.5228e+00 4.4674e+00 -1.0704e+01 7.5457e+00 -6.0601e-01 -#> -1.3818e+01 5.2119e+00 -1.1345e+01 3.5436e+00 1.3895e+00 1.4796e+00 -#> 2.4844e+00 -8.3415e+00 6.8261e+00 4.8637e+00 5.8557e+00 1.5788e+01 -#> -6.6571e+00 -1.9480e-01 -5.0670e+00 5.2520e-02 -9.8149e+00 1.1303e+01 -#> 5.5721e+00 -6.1646e+00 1.1623e+01 -1.4157e+00 -5.3222e-01 -5.4600e+00 -#> -6.5203e+00 -6.7455e+00 7.7968e+00 -9.9367e+00 1.3339e+01 -1.4284e+01 -#> -1.4378e+00 8.9679e+00 9.4038e+00 -8.1249e-01 6.5240e-02 3.7578e+00 -#> 9.4793e+00 -4.9221e+00 -1.6958e+00 2.6819e+00 2.1806e+00 -3.3611e+00 -#> -1.5861e+00 2.8447e-01 -1.2740e+01 -8.7952e+00 5.9439e+00 -3.9432e+00 -#> 4.3443e+00 -6.5303e+00 -7.6826e+00 -4.1726e+00 7.0762e+00 -1.4766e+01 -#> 1.7967e+00 1.0627e+01 -8.8136e+00 3.6239e+00 -9.1490e+00 -2.4560e+00 -#> -1.2485e+01 8.5619e+00 2.9441e+00 1.5589e+01 -4.7064e+00 1.8462e+00 -#> -4.0600e+00 -1.4710e+00 4.7904e+00 -1.8557e+00 8.3307e+00 -7.1752e+00 -#> -3.9976e+00 -6.5843e+00 2.6554e+00 -3.1964e+00 1.0752e+00 -4.8778e-01 -#> 3.3116e+00 -1.7098e+00 3.2721e+00 -4.4763e+00 -2.2594e+00 -7.9010e+00 -#> 5.7161e+00 6.5650e+00 -1.6615e+00 -1.1369e+01 4.7180e+00 5.7694e+00 -#> 8.5136e+00 -5.9878e-01 -1.5651e+01 8.4650e-01 -2.4974e+00 6.7638e+00 -#> -5.8394e+00 5.7140e+00 -1.1210e+01 -2.2551e+00 1.6443e+00 -6.3403e+00 -#> 4.9703e+00 1.8214e+01 -9.8342e+00 -3.6146e+00 -5.6450e-01 7.4347e-01 -#> -7.1601e+00 8.6449e+00 1.8380e+00 -6.6168e+00 -9.3282e+00 4.5143e+00 -#> -7.4665e+00 -4.2576e+00 1.4170e+01 3.6051e+00 -4.7992e+00 3.7910e-01 -#> -1.4914e+01 3.7157e+00 -8.1508e+00 7.3911e+00 1.0948e+00 4.0697e+00 -#> -1.0161e+01 3.9304e+00 3.7996e-01 -1.3344e+01 9.3577e+00 -1.5207e+01 -#> -1.7293e+00 -2.0529e+00 1.0785e+01 -9.5662e+00 -1.9826e+00 5.8388e+00 -#> 2.3011e+00 4.4566e+00 -1.1964e+01 -7.5399e-01 -9.7352e+00 -6.9948e+00 -#> 2.0763e+00 -1.1806e+01 -4.8473e-01 5.3601e+00 2.5147e+00 8.4158e+00 -#> 7.8955e+00 -2.6015e+00 6.0200e+00 -2.1845e+00 -1.3762e+01 -6.8676e+00 -#> 4.4075e+00 1.2753e+01 1.1472e+00 2.4227e+00 -1.0693e+01 4.7613e+00 -#> -#> Columns 37 to 42 -1.1020e+01 6.6465e+00 -1.9290e+00 -4.8945e+00 1.8490e+00 1.5339e+00 -#> -4.5310e+00 -1.0638e+00 -4.4853e+00 7.1065e+00 -8.2337e-01 6.9402e+00 -#> 1.7499e+00 -3.6581e+00 1.2350e+00 2.8491e+00 1.7408e+00 -1.7779e-01 -#> -1.4229e-01 5.2913e-02 -2.4037e+00 8.6961e+00 -1.7492e+00 5.9939e+00 -#> -5.8223e+00 1.6047e+00 6.7784e+00 2.6981e+00 -1.2013e+01 -1.1739e-01 -#> -1.6281e+01 -1.5035e+00 -2.1397e+00 -6.4374e+00 6.2523e+00 -9.6294e+00 -#> 2.9919e+00 4.3445e+00 1.8871e+00 8.1929e+00 -1.2139e+01 -1.4848e+00 -#> 2.7343e+00 1.6345e+00 -3.4733e+00 3.3497e+00 6.7508e+00 -1.8764e+00 -#> -1.9815e+00 4.4974e+00 -1.5160e-02 -2.7958e+00 -4.8445e-01 -1.2368e+00 -#> -2.6236e+00 -1.4458e+01 5.1467e+00 4.9547e-01 -3.0636e+00 -5.2946e-02 -#> 1.2763e+01 -6.6685e+00 2.1419e+00 -2.5938e-01 1.0607e+01 9.4662e-02 -#> -2.5186e-01 1.7765e+00 1.8338e+01 5.5510e+00 -1.2427e+00 3.7455e+00 -#> 1.2176e+01 -1.7097e+01 -1.3656e+00 5.5532e+00 -7.6164e-01 -8.6892e-01 -#> -6.6048e-01 -4.9274e+00 1.1193e+01 -3.6997e+00 -6.0456e-01 -2.1174e+00 -#> -2.6654e+00 -6.7153e+00 4.3796e-01 8.9276e-01 3.0882e-01 6.6746e+00 -#> -3.7119e+00 -4.4686e+00 3.3810e+00 1.2998e+00 5.5979e+00 1.9030e+00 -#> -2.1550e-01 -7.2725e+00 3.9841e+00 1.9504e+00 -1.3238e+00 -9.6834e+00 -#> -1.7437e+01 7.4582e+00 -5.9427e+00 5.0374e-01 -2.0163e+00 6.8246e+00 -#> 1.0537e+01 -6.8816e+00 3.1621e+00 4.3694e+00 1.6787e+01 4.0453e+00 -#> 7.4132e+00 2.5091e+00 5.2410e+00 6.6752e+00 -7.1528e+00 -4.8371e+00 -#> 
6.0238e+00 1.1466e+01 2.0985e-01 1.3284e+01 -6.6727e+00 9.7171e+00 -#> 1.6460e+00 1.2047e+01 1.4384e+00 -9.1026e-01 -3.0676e+00 -3.0724e-01 -#> -2.9586e+00 -8.5117e+00 2.8998e+00 -7.8197e+00 -9.8610e+00 4.7346e+00 -#> 3.0021e+00 -6.0272e+00 2.6469e+00 -4.5248e+00 -1.1962e+01 -5.6412e-01 -#> -2.9277e+00 1.6119e+01 -4.9798e-01 5.0180e+00 1.2726e+01 6.9309e+00 -#> -1.4897e+00 -8.5421e+00 -9.3005e+00 6.3357e+00 2.9906e+00 -3.1236e+00 -#> 3.2158e+00 1.0030e+01 7.0613e+00 -3.3065e+00 -7.8101e+00 1.4496e+00 -#> -3.6055e-01 2.4626e+00 1.1457e+01 8.1908e+00 9.5694e-01 -1.1264e+01 -#> -6.4157e+00 -2.8063e+00 6.9518e+00 9.7039e+00 1.0435e+01 2.9756e+00 -#> -2.5457e+00 -2.9342e+00 -1.4990e+00 -1.3760e+01 5.3558e+00 -1.0135e+01 -#> 2.4038e+00 8.8341e+00 1.1209e+01 1.9843e+00 3.2433e+00 -2.7260e+00 -#> 5.9133e+00 -8.8468e+00 3.7391e+00 3.1379e+00 7.2138e-01 4.3503e+00 -#> -9.0470e+00 2.9551e+00 -7.0717e+00 4.9854e+00 4.3781e+00 4.0203e-01 -#> -#> Columns 43 to 48 -8.8934e+00 -5.8731e+00 5.3382e+00 3.8283e+00 -7.9933e+00 -6.6487e+00 -#> -2.6201e+00 6.9511e+00 -1.7930e+00 1.0116e+01 3.7670e+00 9.1283e+00 -#> 1.2525e+00 7.8141e+00 -2.4204e+00 -5.5882e+00 6.7728e+00 1.0990e+01 -#> -9.0517e+00 5.1123e+00 2.6250e+00 -3.6983e+00 -1.6602e+00 3.9326e+00 -#> -1.8962e+00 2.7367e+00 -1.8428e+00 -2.0395e+00 -5.9638e+00 1.1027e+01 -#> -4.4575e+00 1.3333e+01 3.1724e+00 7.7592e-01 -5.7508e+00 1.6969e+01 -#> 8.7714e+00 1.4718e+00 -6.6618e+00 -7.8995e+00 9.0566e+00 9.4957e+00 -#> 4.8238e+00 6.6510e+00 -8.4335e+00 1.5356e+00 2.9097e+00 5.2194e-02 -#> 8.9002e-01 3.0509e-01 3.3963e-01 -4.1054e+00 -4.2206e+00 -3.0510e+00 -#> -1.1020e+00 -4.3733e-01 -3.6850e+00 7.9086e+00 4.6608e+00 6.0527e+00 -#> 7.2235e+00 2.0171e+00 -1.2581e+01 -1.5761e+01 -6.6077e-01 2.3741e+00 -#> 3.0645e+00 6.5877e+00 5.2796e+00 -7.6957e+00 -5.3312e+00 9.6305e+00 -#> 2.5498e+00 -3.3503e+00 -4.4238e+00 -1.2314e+01 7.3962e+00 1.7007e+00 -#> 3.7125e+00 -4.4944e+00 -3.4737e+00 -9.5028e+00 -3.9384e+00 -1.1472e+00 -#> 2.9937e+00 1.5265e+00 3.5508e+00 2.3970e+00 9.5820e+00 1.5473e+00 -#> -1.3902e+00 -8.5131e+00 -1.1774e+01 -4.6225e+00 7.6096e+00 -7.9541e+00 -#> 6.0351e+00 5.2485e+00 7.6588e+00 4.2709e+00 4.7346e+00 6.0520e+00 -#> -3.9830e+00 7.0588e+00 6.1215e+00 4.3871e+00 -1.1102e+01 8.4514e+00 -#> -2.8928e+00 -7.0470e+00 -9.8818e+00 1.0352e+00 4.5459e+00 -8.3517e-01 -#> 7.1982e+00 -3.6810e+00 -1.4942e+00 -9.4427e+00 1.7525e+00 4.1326e+00 -#> 9.4196e+00 3.0754e+00 1.7681e-01 -1.2273e+00 -7.9381e+00 -1.5840e+00 -#> 9.9200e-01 -3.7385e+00 -4.5607e+00 -9.1025e+00 -1.7741e+00 -5.9780e+00 -#> 3.4331e+00 6.9655e+00 2.2767e-01 5.4729e+00 -1.9004e+00 1.0142e+01 -#> -2.0826e+00 -6.2107e+00 3.6577e+00 2.0191e+00 -1.2671e+00 2.2451e+00 -#> 3.1745e+00 -1.8666e+00 9.5549e-01 1.1877e+00 -5.7058e+00 -1.1480e+01 -#> -3.5149e+00 1.5049e+00 -5.9846e+00 5.1825e-01 1.1530e+01 7.2957e+00 -#> 1.3030e+00 2.6066e+00 1.7635e+00 -9.6033e+00 -4.6952e-01 6.4410e-01 -#> 1.0071e+01 1.2886e+00 9.9078e-02 -1.1346e+01 2.7033e+00 -1.6566e+01 -#> 3.3214e+00 -8.9271e+00 -1.5329e+00 1.2822e+01 7.3061e+00 -1.0610e+01 -#> -5.5313e+00 -1.9182e+00 5.2385e-01 3.3351e+00 -2.3038e+00 -2.5890e+00 -#> -9.6380e+00 -2.2434e+00 4.5545e-01 -2.4699e+00 -8.2808e+00 -7.3125e+00 -#> 5.7026e+00 2.7890e+00 -1.1418e+01 -2.5375e+00 6.9807e+00 4.0363e+00 -#> -4.7068e+00 -5.1611e+00 -9.9217e+00 2.1581e-01 5.1793e-01 1.5048e+00 -#> -#> (6,.,.) 
= -#> Columns 1 to 8 -0.1104 0.9378 -5.7425 -1.3886 -6.7944 -1.2199 2.5925 -9.2208 -#> -9.2785 1.3463 2.8895 6.0180 -2.2107 -3.1544 9.8532 -1.1469 -#> -2.0518 -13.8740 -2.3745 6.4815 -3.4840 -8.7533 2.0470 -1.2614 -#> 2.8974 1.6251 -6.4410 11.9340 -4.1402 5.6601 2.2437 1.0130 -#> 2.1063 -0.8892 6.5428 0.7713 1.8376 0.2168 -1.7869 -3.1655 -#> 5.9192 -6.5299 3.9429 -4.7417 -2.0345 -2.1108 1.4094 -1.7040 -#> -3.1214 -9.4695 7.8207 -5.2906 3.8265 -3.1306 0.2918 8.0468 -#> -2.8765 -5.4424 5.3099 5.6219 -2.0720 -3.4595 6.4237 9.0764 -#> -6.5373 2.7311 5.2222 3.8423 3.1656 0.4355 8.0006 -2.0814 -#> 1.2809 2.4835 1.5944 4.1153 -2.1983 -3.9559 -3.0186 0.6905 -#> -12.6721 7.8146 11.8049 -1.7205 -10.0410 8.1629 3.5770 -11.8054 -#> 5.8688 -5.7280 6.1080 3.1844 7.2086 -9.3712 -2.7592 6.2454 -#> -5.4817 8.3479 0.4329 -4.7231 -3.7640 -0.2761 -4.6210 0.6497 -#> -2.6736 -0.6957 4.6696 0.0148 -5.5836 0.8055 0.7237 -2.8764 -#> 5.1874 2.4803 -2.1401 4.0009 0.3659 0.4488 8.4825 6.3955 -#> 8.2776 -6.1439 -6.7428 0.6655 4.5106 -7.2390 -1.9144 -5.6041 -#> 2.6683 0.6469 -7.0735 -0.0103 9.7698 -1.6489 -1.4062 0.4159 -#> -3.5536 -11.1788 8.0697 -5.3634 0.5343 5.5538 7.8198 -7.0964 -#> 7.5448 3.3799 -3.9270 0.0477 7.1459 -2.2052 0.3239 2.4638 -#> -8.3648 11.6064 -1.8365 -2.8855 -3.7194 -9.8075 -13.9457 2.6624 -#> -0.8695 -7.6880 2.9305 1.6563 -4.7491 -5.6331 7.2559 7.0108 -#> -4.5930 8.6262 10.1692 -1.9337 -1.3028 1.9578 4.1001 -1.4597 -#> 5.8933 4.6195 0.6673 -3.3505 4.2870 -3.4126 -1.0499 -1.0670 -#> 3.0959 11.1196 -11.3394 -2.6640 -3.7993 -3.4703 -4.4409 4.4927 -#> 7.0263 1.0446 -5.2210 -0.0563 7.9781 3.2414 4.7680 0.0977 -#> 8.4570 2.5366 -12.1487 8.7612 -3.6473 -6.9408 -9.0835 3.5213 -#> 1.4681 12.0926 4.3367 -7.4203 5.1335 -1.2544 1.0279 2.6569 -#> -1.6785 -21.8863 -1.3839 9.7845 -9.6162 -2.6612 3.2954 -1.9311 -#> 0.9169 -1.5492 -3.9720 3.8593 -0.8790 1.1036 1.5049 6.8214 -#> 0.2175 -10.2392 -8.1670 -1.4911 -2.5520 -10.8965 -1.0136 -5.1678 -#> 1.3053 0.3541 -2.9467 7.1714 4.1810 2.3582 4.0887 4.2044 -#> 5.4859 11.4342 -12.2243 6.8399 2.0426 -0.6062 -4.4571 6.7048 -#> 14.3603 -3.2200 3.9387 1.3679 -3.8398 -8.6160 -2.1590 -8.0406 -#> -#> Columns 9 to 16 -6.3904 5.2458 -4.8878 0.4777 9.6190 -4.0347 6.2520 0.1535 -#> 5.3624 -4.1868 -9.0603 6.5989 -2.7368 -0.5048 -8.6676 5.6638 -#> 5.2559 3.3481 -7.0247 -1.2221 11.6998 5.6450 -0.9032 6.0717 -#> 0.3988 8.7230 1.8028 14.9125 0.2944 -0.6893 1.6754 -1.2014 -#> 5.9161 -2.8328 11.3242 -0.0847 1.8416 2.2938 -4.5816 2.5918 -#> -1.9162 -1.6693 12.8858 -7.4261 -4.7459 21.0233 -0.7926 -13.2537 -#> 0.4529 -1.3662 16.3296 -6.9726 11.5936 -2.3820 -12.0824 -0.4903 -#> -3.9553 2.5013 -4.9634 -13.3355 -3.9944 -0.6187 6.8524 -4.1107 -#> 4.6945 4.5045 -8.6546 10.3891 -2.7178 -5.6847 4.6090 1.6181 -#> 9.8551 -1.0476 -7.6735 9.0922 5.5801 2.0157 -3.8528 14.7383 -#> -1.8485 2.3732 5.4091 -0.7128 9.2158 -4.9523 -13.8347 -14.1544 -#> 1.9425 -4.4296 -0.9140 -20.1580 3.1129 -4.2815 -1.1541 -4.7674 -#> -6.3371 -1.7931 -2.5522 3.0294 -5.5946 0.2522 2.1318 6.1523 -#> -6.9545 4.2920 3.5247 12.1743 -8.3309 1.5324 -4.0184 -2.4786 -#> -7.9861 0.6720 8.6008 8.4768 -5.7279 6.4261 -0.0801 -1.1222 -#> 2.8641 11.7266 -2.5167 11.6780 -3.7370 0.2652 7.3983 3.9098 -#> 9.1588 -3.0015 -3.0287 7.1194 -4.4060 0.8970 -6.1486 5.8799 -#> 1.1350 4.9105 5.6519 -7.8276 5.2783 3.6180 -5.6088 -4.1530 -#> 11.4123 0.2572 -2.4716 -1.5631 -4.3751 -8.8379 -9.4847 -10.3646 -#> -2.2827 -5.8110 4.0167 5.5697 -1.9213 -6.4974 -6.3152 3.8903 -#> 0.8398 3.9075 -4.3331 -8.2620 -0.2703 -12.2609 -1.9502 -0.2135 -#> -12.2079 -1.3780 
-1.2939 10.9357 -1.0795 -15.5522 3.0758 10.4788 -#> -1.6880 -4.5991 12.9073 -0.1889 -3.7366 3.0858 -3.7235 -6.3634 -#> 14.6158 -10.9498 -2.9490 14.3120 11.5878 8.4827 10.1251 7.4441 -#> 12.6866 -0.0641 -3.6279 3.0155 -9.3858 -11.3412 -11.1634 -9.6604 -#> -1.7954 3.4595 3.4319 -3.6742 -8.5487 -3.0773 3.5025 1.9246 -#> -2.6488 0.0993 6.1076 -3.1861 1.9818 -3.6518 6.4687 -0.0724 -#> -1.3800 -9.8271 2.9700 5.3824 -0.3110 7.3517 -8.2606 -11.3658 -#> 7.2609 4.6560 -10.1995 8.5112 -9.4829 -10.5180 -6.1653 -0.7339 -#> 6.4446 3.1358 1.7575 11.6717 15.9693 -1.8292 -1.2986 -5.1633 -#> 5.2860 6.0090 -0.7773 -4.0574 2.2034 8.1937 10.7889 3.1794 -#> -2.7170 0.8778 7.5877 0.2280 -13.0934 12.6423 -10.7424 2.5844 -#> -3.7412 1.8666 -0.7106 -0.6447 2.4800 -1.3900 2.4915 -3.1562 -#> -#> Columns 17 to 24 5.4172 -8.8975 4.1241 4.2239 -9.0144 0.6235 -4.9852 -5.5359 -#> -6.1860 14.5444 12.7085 -4.9095 -1.4238 -1.5653 -4.4326 -0.0120 -#> -6.4332 15.0012 7.7659 -0.6413 3.6668 7.6917 -0.7941 -3.8065 -#> -4.8550 0.6845 -5.9768 1.8772 -7.2331 7.2820 2.7222 -3.4189 -#> -1.8828 -5.6188 3.9142 -15.6999 1.0825 3.6080 0.5859 0.2142 -#> -10.4254 -7.3322 7.6011 12.1679 2.1166 -1.3719 -0.8404 -5.9731 -#> 1.6563 -8.6208 0.9193 -1.9948 4.1875 9.4993 -1.0747 -8.5283 -#> 3.6731 1.8565 -0.7910 9.2543 4.0691 0.0453 -3.9086 -7.4447 -#> 1.2494 1.7534 0.7568 -3.5535 5.5141 0.9877 0.9665 3.8092 -#> -7.3750 5.0936 5.8954 -5.7369 -3.3362 -4.8037 -4.0630 -7.6349 -#> -15.0747 6.5869 10.2412 -8.1852 12.6544 -6.4481 -17.7733 9.8440 -#> 2.3890 5.0567 2.5480 11.3571 6.9356 1.6431 -4.5357 -2.4146 -#> 11.5287 3.0665 -12.9552 10.2215 -10.5193 -0.5883 4.4301 -2.9194 -#> -4.5341 4.6156 -11.1645 5.2721 4.7755 -5.4269 8.8919 -4.1058 -#> -2.5422 -4.8962 -4.2533 -3.4713 0.4350 -0.3875 5.2587 6.7581 -#> -2.2825 -1.3043 -5.7796 0.6123 -1.6119 -9.6504 6.3327 -4.8924 -#> 11.4267 3.9958 -5.8304 -0.8209 -9.3814 -1.0441 7.5002 -10.5179 -#> -7.2332 -2.1754 21.7108 -6.8779 0.6091 2.9913 -5.5717 -9.2769 -#> -8.4591 6.1532 -3.9296 0.5712 -5.5937 2.2535 -12.5900 -2.0844 -#> 4.4383 5.7097 -7.7355 -5.1943 5.3091 4.7351 -3.1911 -7.3453 -#> 2.7565 17.1407 7.5476 2.4632 18.5448 -0.6328 1.3705 -5.2688 -#> 21.0880 -6.0673 -15.0749 -6.9460 -0.4218 -10.5325 7.6761 7.2181 -#> -13.9597 -5.9834 -13.9455 3.4727 13.5404 2.1127 6.0688 3.8601 -#> -9.5446 -6.9907 -11.8360 -0.7782 -3.4148 0.4085 -7.5634 9.1145 -#> -7.9267 4.0797 13.3777 0.9393 7.7659 7.1630 -9.1600 -2.5893 -#> 6.0249 -5.8716 -5.5973 0.0046 -3.0231 1.1144 -5.0819 -5.3707 -#> -4.0046 -13.7106 -4.8585 4.0724 6.5528 8.7553 3.9177 6.3200 -#> -0.9138 6.5573 8.3198 2.7524 4.8368 -0.0342 4.3763 0.9721 -#> -4.3229 19.7568 11.2584 5.4611 -1.0099 -5.6638 -2.6794 -7.9756 -#> 3.7537 1.4670 -7.9778 1.0656 -12.9156 2.0157 6.9521 0.2425 -#> -5.8054 -7.1049 -2.3444 -1.4620 -7.0817 3.2761 2.1241 0.6728 -#> -4.7399 8.6810 -0.3137 12.5848 -2.7645 -4.2000 -7.7583 -4.0197 -#> 6.5134 -6.5004 -1.0067 3.1483 -7.6219 -9.4831 2.1260 -2.9960 -#> -#> Columns 25 to 32 0.9164 6.6364 -2.1096 7.6167 5.2051 -3.5309 5.7607 -3.6761 -#> 5.7481 -6.2620 5.5865 -7.7751 6.8103 4.0659 -5.4968 3.3758 -#> 8.8301 0.1947 1.3354 -10.1364 2.6818 -6.0935 -2.1317 5.1515 -#> 3.2282 10.2814 -1.9183 -5.0089 0.9938 0.0229 -14.7824 11.1888 -#> -1.7901 0.0106 0.7380 3.1113 0.0016 3.8805 8.3862 7.7820 -#> -5.4735 9.7038 0.6526 1.3334 2.2150 -16.1218 -2.6737 12.9847 -#> 7.7840 -3.3068 -1.8455 2.6265 0.7157 -3.4875 8.9484 1.1720 -#> -5.0259 12.4743 -4.9562 4.6295 0.6454 -10.6554 -1.6643 11.3980 -#> 5.6431 -4.1585 -3.5017 -4.5217 -6.4074 2.7111 -0.2705 2.2997 -#> -4.3305 
-#> ... [verbose tensor output truncated: raw values omitted for slices (7,.,.) through (10,.,.), 33 rows x columns 1 to 48 each] ...
-#> 
-#> (11,.,.) 
= -#> Columns 1 to 8 2.8781 2.2097 -1.0166 8.1775 -0.7718 8.1902 -0.9899 -11.8470 -#> -6.2581 5.5743 7.7125 -12.8765 -3.1623 0.1986 -5.9956 -9.2363 -#> -0.2172 -0.8420 9.4055 -11.2229 -1.8000 -1.2664 -1.7218 -1.0896 -#> 11.4138 1.5981 0.7542 1.7948 -1.1702 -1.7719 -9.2642 -9.8508 -#> -3.8479 8.2731 3.5893 -6.5526 3.9244 -7.8203 -9.8070 3.4631 -#> 9.6756 1.5901 -10.3462 -10.4983 5.3770 -0.4591 -2.7411 3.5073 -#> 5.7282 -6.6532 7.0050 -3.8995 11.4720 -1.5143 -3.0646 7.9134 -#> -10.4324 2.0450 1.2011 5.9723 -2.9955 5.4474 7.2957 -3.6325 -#> 8.0859 1.0551 0.5931 0.0603 -6.0898 -3.1183 5.7426 -6.1302 -#> -12.6554 11.6160 9.1281 -8.6554 -0.7018 3.1528 -6.0132 -5.0604 -#> -14.8811 9.6235 -2.2084 0.9378 5.9259 2.6533 -0.6158 -0.3892 -#> -7.6016 11.6173 -4.0455 -9.1039 7.3522 1.6756 1.2683 7.6569 -#> -4.3699 -5.2974 5.9473 7.1043 -3.2537 5.4743 -1.0449 -5.1495 -#> 5.2138 -2.3339 -1.0099 0.8368 2.8119 2.7774 3.6714 0.0031 -#> 8.2720 -6.8858 7.2203 -5.7084 -10.2222 -5.5913 -9.6192 0.9030 -#> -2.1772 -4.7607 14.7243 3.4797 -0.6596 -1.4458 2.4896 -5.4278 -#> -7.1641 -0.1249 -7.9006 -3.6037 -3.3484 8.6855 -5.7696 2.6918 -#> -2.5526 -4.4526 -2.4355 -3.2067 14.2889 3.3216 1.6071 0.7929 -#> -15.2051 14.2628 12.3325 3.1186 -1.8251 -7.4071 -2.9547 -1.7397 -#> -5.2387 9.3243 -7.0639 -0.8562 -4.1239 3.8398 1.4057 -1.6789 -#> 0.5528 -0.8265 -10.4466 -4.1306 6.3734 1.5282 12.3649 4.0504 -#> 11.1108 -3.0633 -3.6177 5.7682 5.1008 -2.4357 7.9087 -9.6357 -#> -10.3575 15.5950 -6.8540 -12.3639 -0.0412 -8.0590 -2.7639 6.2915 -#> -7.8598 14.0037 -4.7523 2.2997 6.1635 1.7026 -6.2875 -1.1226 -#> 3.5824 -5.8863 4.7592 -3.0903 2.6901 -1.9886 2.6103 5.2841 -#> -1.5312 4.2905 8.7163 1.5517 -6.8520 1.7236 -6.1933 -7.1499 -#> 9.4862 2.0688 -4.7118 -1.9646 -3.7517 -5.6849 -2.6916 -0.4818 -#> -6.3272 -8.3648 1.5437 -12.2026 4.5188 11.4448 -0.7743 21.8454 -#> -4.0406 3.5181 4.3892 -3.6517 -10.6121 1.8866 2.7786 -0.3579 -#> -14.2391 7.8265 -13.1909 0.1867 -0.9296 1.5074 5.8571 -1.2301 -#> -0.2945 6.9921 6.2890 1.7168 -1.7118 -6.4731 -5.1135 -1.5401 -#> 2.0167 -0.0208 2.3058 4.0023 -9.0107 0.2120 -7.0342 6.9787 -#> 7.1531 0.8605 9.0946 -3.1487 0.0899 -3.9209 2.6381 -3.0970 -#> -#> Columns 9 to 16 -6.1915 -8.2626 -9.0006 -4.0418 -5.2003 -5.3085 -4.1699 16.7122 -#> 19.5822 3.5655 -0.5455 2.5924 2.8166 -7.1293 2.1717 0.6939 -#> -0.0666 -1.4415 2.4082 7.5363 0.5220 -1.4166 4.4682 -3.4278 -#> 4.9616 0.3099 -4.0076 11.1014 -5.7296 2.0147 -7.3795 9.0969 -#> -8.2887 -3.4343 9.5883 -2.1146 0.5618 -2.2804 5.8889 -7.3778 -#> 14.1456 5.9190 -2.0328 -2.6687 2.2124 0.8063 6.1982 2.2195 -#> -1.5928 0.4058 8.9747 1.6188 3.1345 -9.7028 -7.3274 6.0963 -#> 10.4677 2.9256 0.9422 11.0137 -2.2644 -11.5415 7.8916 2.9541 -#> -6.0708 4.4333 3.1589 -5.0676 -7.9851 2.3739 -6.0588 -2.1832 -#> 10.8193 4.0739 0.8660 -2.5729 0.8408 -8.6158 4.2908 -6.9327 -#> -6.8986 5.9385 4.4483 -11.7290 -14.5286 0.4827 17.4183 -2.5515 -#> 5.1375 4.0198 8.6159 -4.3389 -13.8087 0.7683 8.3413 2.9647 -#> -6.5967 -1.5645 -1.2306 12.1490 -6.1894 14.5564 8.7421 -4.9975 -#> -11.2655 0.8737 -6.1851 0.2204 -5.8681 4.7180 5.9456 1.9183 -#> -12.2213 -15.1589 -2.7198 2.5953 -6.5069 -10.0845 9.1320 -10.0574 -#> -0.7626 1.8518 -2.9269 1.0017 12.3075 5.7738 -5.0093 9.1991 -#> 3.9567 -2.3733 5.1063 4.6667 -0.3354 -10.5436 -11.7094 8.5977 -#> 2.9944 -6.6322 4.3978 -6.5010 -0.0759 -10.2965 4.9730 2.9660 -#> 8.9919 2.6644 -4.4370 7.2588 4.9745 1.3050 19.2494 10.3855 -#> -1.0392 -2.9545 -5.6787 3.8888 -1.4883 -5.5207 6.1828 -3.4936 -#> 10.7327 4.6214 -0.5641 0.3687 -1.4311 -4.9733 3.9833 0.7176 -#> 
-8.3585 -5.7510 -4.0912 -2.6170 -1.7727 7.6623 -6.1395 -9.1816 -#> 2.2889 -0.2266 3.1189 -2.6195 4.2082 -5.6330 9.8348 7.9776 -#> 8.7933 12.3731 -3.5023 2.7876 -0.1834 2.7520 -10.9274 -2.5809 -#> 11.7923 6.6071 -4.3929 -12.0979 -4.8559 -3.5806 -1.5844 13.4595 -#> 0.7568 -5.8900 2.0718 14.9415 2.2399 -6.6502 14.8196 8.2244 -#> -3.6494 -1.7169 6.9579 -0.4423 -3.3180 -1.1368 -11.8920 6.4290 -#> 6.7029 -10.1809 -1.9819 -5.6661 -2.7719 -2.5286 -3.1997 -9.4372 -#> 8.7309 -4.0389 -11.5737 -8.1458 -3.5622 -0.5749 8.0254 -1.8079 -#> -10.1502 -2.2120 -3.8260 6.5190 6.1603 0.6438 0.1826 1.3515 -#> -3.4241 -1.1157 6.5079 11.2266 -4.7174 -7.4841 2.2192 -3.6866 -#> -1.7339 -2.9707 -5.3129 -6.4849 -7.4874 8.1912 9.1298 -6.6721 -#> -3.9521 1.4523 0.2940 -1.9947 -3.2829 16.9990 4.8489 6.0814 -#> -#> Columns 17 to 24 5.6664 11.0845 -10.2942 -4.7752 -6.0080 8.1106 -3.5690 0.9122 -#> -7.6126 -19.9690 4.6329 -8.9806 -0.4967 2.6903 7.8598 -4.5414 -#> -11.8706 -5.2885 -0.0619 -1.4086 3.6219 0.9005 1.2591 -2.8981 -#> -5.7637 -11.8687 -0.4377 -7.7378 4.4512 -7.9747 -8.7335 0.4574 -#> 3.1455 -1.0820 2.9134 0.2344 3.9357 -11.2709 0.1974 -3.3359 -#> -12.8174 -2.0394 -9.5841 2.4451 7.2319 -0.3387 0.5652 -0.1930 -#> -13.7959 8.0847 3.8370 -1.3418 -10.4121 5.0823 -8.9951 -1.7491 -#> -6.6635 3.5142 1.4026 -11.0846 -2.7440 9.2802 3.5806 5.7885 -#> -1.1767 -0.9026 -8.0627 -2.0746 12.1456 -3.9612 -5.6939 3.9885 -#> -1.7274 -16.3384 7.3582 -6.4908 -0.4155 -1.4168 10.5900 -8.3264 -#> 22.9371 -2.6684 7.5032 -5.1639 -0.7731 -15.6504 2.5496 -25.6584 -#> -9.9856 13.7973 0.9767 -14.3910 -3.7744 0.1570 -6.9156 -0.4604 -#> -1.7880 3.5345 1.4397 1.9685 8.2998 -7.1436 2.2520 -1.8548 -#> 2.6198 6.9447 1.9977 -4.7944 7.8972 -9.6881 -14.5384 2.0402 -#> 2.4234 3.0974 -5.9777 12.3375 2.4437 -5.6257 0.3055 3.5800 -#> -2.0321 -4.9167 1.8321 -1.6269 -7.0493 -7.4840 -6.6248 8.4155 -#> -6.3536 -5.2497 -0.8554 11.0275 0.3792 2.4790 6.4480 -4.4525 -#> 0.3469 6.9205 4.5552 -4.9644 9.5360 -1.3360 -9.3090 -4.5213 -#> -6.4502 -11.7928 -6.1335 -8.0252 0.8029 -9.4381 5.9685 -4.3368 -#> 8.8216 0.8329 -0.2735 -7.1837 -8.0822 -9.6586 1.2252 0.6287 -#> -1.3171 -6.6742 6.8636 -18.5117 5.8795 5.2577 0.1714 3.6908 -#> 1.8756 8.8235 -7.2492 1.8759 -0.0843 -4.4299 -16.8106 9.7002 -#> 5.3941 -1.8122 -2.9792 -10.6169 3.2304 -1.0800 -0.9455 -3.4328 -#> -7.0644 -8.4130 4.7494 5.3553 7.0956 -8.0271 20.0530 -7.7643 -#> -2.8851 -2.7643 -9.6940 0.6988 -13.5548 -11.2170 -6.3717 6.5294 -#> 1.4785 -1.5019 -13.8826 2.5866 -6.0112 1.4736 5.9116 13.5663 -#> 4.4456 3.7312 -3.7568 -0.6211 3.6518 0.4688 -5.2054 4.7016 -#> 4.1419 7.2961 22.8483 18.7562 -13.6192 10.4419 5.3509 -19.8762 -#> -1.3481 -10.2438 -1.7149 -11.4341 -10.5016 4.5516 -6.6300 10.4442 -#> 4.8445 8.3227 -5.4541 7.5567 0.8922 -0.4789 0.3967 -5.8435 -#> -4.6249 13.5624 -6.0266 -7.0758 0.9136 -2.8116 0.4097 5.0527 -#> 2.1420 -5.8441 1.0169 3.2376 -15.4424 -10.0194 7.6189 0.4871 -#> -6.5184 -1.2838 -10.3528 -3.5135 -7.1717 -4.0199 -10.5834 9.9731 -#> -#> Columns 25 to 32 -10.3515 3.4975 -3.6517 7.7534 0.3283 0.2941 -7.0501 -6.8274 -#> 1.4270 12.2531 9.8292 -0.3401 3.0283 1.5113 2.4921 3.2593 -#> -7.5005 7.8531 8.1607 1.5169 3.4709 -12.8539 -1.9708 12.2384 -#> 2.4944 3.9962 0.7539 3.6741 -9.7667 -4.9758 3.8036 2.6912 -#> 5.5318 2.7070 1.6485 -5.0410 -0.7996 -4.4982 4.1551 15.4612 -#> -10.5261 -5.4848 4.7405 -1.1069 3.9326 5.6820 -5.2473 -8.3052 -#> -1.1329 -1.5255 12.7872 -3.1154 -0.9708 2.8787 1.7427 14.3937 -#> -12.2924 -4.7862 7.1931 0.5332 7.4366 7.9451 -9.8216 -8.7823 -#> -0.7814 4.8600 0.4755 11.8892 -3.6985 
-3.3411 5.5341 0.2644 -#> 3.4088 5.4109 8.5450 -2.0580 -0.5300 -3.7219 7.3510 7.8470 -#> 9.0335 9.4278 11.0499 -3.8728 -1.4293 -12.1467 1.2945 -5.4632 -#> 6.8743 11.7608 1.6648 -4.8751 13.1060 -2.2418 4.4712 3.7576 -#> 6.7995 3.0815 1.0554 -3.7866 1.1449 -10.6269 -4.1921 -1.4030 -#> -0.4489 3.0557 1.8424 2.0436 -6.4865 -3.8906 0.9938 -2.0567 -#> -10.4877 -4.4549 1.0167 0.4971 -7.5357 -6.5673 -10.2117 4.3548 -#> -0.8417 2.0043 -0.5281 -1.6514 -12.2684 -1.2485 -2.1550 2.2551 -#> -12.7666 -6.6186 0.8067 5.8773 -3.6131 -2.2668 11.4308 7.8232 -#> -9.4452 -1.5626 6.0157 -1.6627 4.5507 -0.6084 -3.8642 1.6012 -#> 2.9794 0.1325 3.8421 -8.8259 -0.4939 -18.2918 -11.1140 -4.3855 -#> 6.2471 4.9542 9.4580 0.6201 -2.4679 0.5176 9.5204 6.5565 -#> -0.3907 4.6630 12.7648 -0.0060 7.7318 9.3784 -0.8921 5.6945 -#> 12.5322 13.0636 -1.7965 4.8947 -6.6470 2.9905 9.2692 -3.8457 -#> -0.9241 -11.4957 -1.6680 -5.9663 9.7777 -4.1905 -8.1391 -2.7973 -#> 5.2600 -4.3981 -7.8513 -1.1895 4.0581 -2.9813 10.5569 5.8662 -#> -6.3160 9.7658 2.4030 10.7301 -2.1206 0.5316 2.3882 3.0174 -#> -9.0679 -6.4728 0.3893 0.7980 -4.6750 -4.5644 -10.7221 4.0892 -#> 1.4679 -5.2258 -3.4626 -0.5695 2.0237 -1.4349 2.7530 -4.8818 -#> -7.9846 6.8764 4.2227 2.3961 0.2227 10.2062 9.4042 9.6672 -#> 0.3163 12.4046 -0.8354 0.3742 -5.6021 -1.8028 -0.4147 4.0197 -#> -7.1034 -3.1581 -15.2460 3.4164 1.7271 -13.3808 8.1067 5.0651 -#> -5.3282 -7.1169 -14.3244 -2.9758 9.1079 -3.9552 -3.5461 -5.6140 -#> 15.6531 4.8156 1.1635 -6.2082 -6.7749 4.0234 -1.0711 -5.8332 -#> 4.3762 14.6048 -1.9513 1.7074 -6.0273 -8.7757 -5.1166 8.4456 -#> -#> Columns 33 to 40 -10.1023 -3.9219 -9.1799 9.5749 8.4975 -11.1930 9.7151 7.1257 -#> -4.6959 -0.7548 12.8207 10.9396 0.5109 -2.8755 13.5778 -6.5760 -#> -5.4996 -7.8202 4.7448 6.0284 -1.3823 8.4791 13.4582 9.2748 -#> -6.7925 3.9432 4.9907 6.8099 3.3344 2.6087 3.8572 -13.1837 -#> 4.0631 2.9278 -2.1702 7.0277 -5.9275 -6.6336 -14.0380 -0.8810 -#> 13.7508 7.2422 10.8456 3.6631 -9.4903 -0.0949 -9.4799 -13.1683 -#> -6.4126 -4.4764 -3.1920 3.6200 9.2765 -0.6455 -2.4736 -3.5236 -#> 5.1701 -2.2355 -4.0393 -2.0168 9.7074 -5.0182 5.5302 0.5364 -#> -4.7535 2.4163 -7.0061 14.8368 -6.1606 2.4834 -2.1907 0.2419 -#> 1.8371 -6.5680 2.0259 6.1676 1.0617 -9.2078 1.2528 6.3787 -#> 9.4472 -6.5630 1.2068 10.8190 -6.5610 3.5445 -2.8692 2.9642 -#> -7.2891 6.0978 -3.3751 4.1682 2.6300 1.0304 6.1595 -13.8636 -#> 0.6786 -0.7477 -7.2634 -1.8376 3.1016 5.0136 3.9515 1.5174 -#> -12.6435 4.9984 1.0216 3.7301 -11.5680 9.5201 -1.4775 -8.6870 -#> -3.6625 4.2057 5.5981 4.0501 -3.0935 9.0510 -4.4381 -1.4277 -#> -7.8689 12.5444 -3.1417 10.3896 -9.9695 3.2747 2.8322 1.7519 -#> 1.8620 -12.7264 2.2756 -1.6458 1.6316 -13.8687 -4.1662 16.7878 -#> 0.3420 3.7973 -1.1359 -1.4965 -4.0083 -4.1914 1.7148 2.4032 -#> 6.3714 2.4896 7.8242 6.8193 7.0321 3.2046 6.8605 -4.6502 -#> 2.8786 -16.2526 2.4352 -3.8882 2.2542 -2.4454 -1.5951 4.7389 -#> -23.1875 -4.1333 6.3962 6.6887 -6.1362 0.8058 15.6956 7.2013 -#> -8.7329 7.1801 -5.0613 1.1381 -2.0916 10.0014 -3.2677 -12.7174 -#> -4.5880 11.6920 14.5654 1.7693 4.4473 -4.4503 -6.5495 -8.7196 -#> -12.2589 -7.2170 7.9125 3.5010 -9.2617 -7.6637 7.5840 8.4875 -#> -5.3479 -5.0334 5.0921 6.9245 -8.3232 7.1854 4.1576 -1.5226 -#> 8.1612 -11.9883 -3.6214 13.7768 3.6118 -6.4589 -4.7312 -0.0460 -#> -4.4050 2.7748 -5.1623 7.4096 5.4151 -9.0273 -6.8535 -1.7368 -#> -2.0037 -5.8586 4.3548 -7.4482 -17.8355 7.5988 -5.9774 3.9989 -#> -4.5390 -2.9569 6.4785 2.0471 4.1070 -0.4610 10.3860 -6.3458 -#> 2.7225 0.7216 -7.2652 -4.7884 1.4376 -6.6834 -0.9750 8.7055 
-#> 6.1952 6.8069 -4.0349 3.0092 -4.8960 1.1997 -1.7488 -7.0332 -#> 3.4292 -9.9456 6.0187 -7.4573 6.3376 -2.4621 -2.0052 0.2439 -#> -2.6957 6.6021 -1.6683 9.6269 -5.0222 3.0561 -0.4602 -3.9711 -#> -#> Columns 41 to 48 -2.3662 -4.1178 6.0542 -17.2736 1.1544 -14.6532 -4.2327 -1.6723 -#> -9.6362 -4.0162 8.9831 0.0999 0.5063 -5.7267 -0.4100 -3.6152 -#> 4.8958 -15.0266 -0.8540 -2.6382 2.8027 3.7457 6.4793 -3.2880 -#> 0.2428 -20.9852 -2.8938 -5.1603 11.2823 -5.0218 3.5444 2.0790 -#> -4.0056 2.6767 -14.2301 -2.8351 -1.5785 9.4275 -3.2266 0.2698 -#> 13.7052 -6.2582 -17.5901 13.6814 2.5738 7.9126 -6.7576 3.7483 -#> -0.4129 -9.1739 -1.6229 1.1630 0.6235 4.9927 0.3573 -0.4206 -#> 9.1233 3.1910 10.6905 9.6390 2.7398 -2.4940 0.5708 -2.8679 -#> -7.8422 -9.4731 -9.4229 -5.5931 0.8138 8.7929 -2.3753 -1.6864 -#> -3.2461 3.6471 0.5257 -10.9632 3.6299 -0.0675 3.1798 -15.2807 -#> -14.9918 -3.9584 -9.7075 14.5392 -5.4990 -1.5539 -12.2078 8.0035 -#> 9.5730 11.4291 -7.0751 0.0573 -15.4524 8.8957 3.7455 2.1895 -#> 7.8082 -3.8102 6.6591 -1.2311 -4.3900 0.9965 9.2195 -10.1490 -#> 3.8526 -15.6402 -5.1694 -3.2650 5.3320 0.6164 3.4690 -5.3112 -#> -1.2557 -12.2737 -7.2549 -3.3846 7.4051 1.0831 -4.2061 -6.6609 -#> 0.8970 -14.5638 8.9470 -8.6792 -1.5913 10.7282 3.4311 -6.1475 -#> 4.0533 3.6346 22.0376 -11.0145 -3.3696 -10.2912 1.8133 -4.4615 -#> 6.9292 1.4592 -10.4186 9.6825 1.6393 7.4733 -6.5915 1.2002 -#> -9.6357 -5.2183 6.0034 6.9522 -8.2266 2.9211 -1.4970 7.7575 -#> -4.3216 -1.2334 -8.8963 -0.6198 2.8602 -9.6112 -12.6712 -1.2095 -#> -0.2654 -4.1480 0.0734 8.8209 2.3408 0.8615 -4.3848 7.7182 -#> -7.8335 -11.9261 -1.9578 -4.8017 -4.1270 -2.7510 -7.9025 3.6308 -#> -5.1774 1.1036 -3.9699 -7.1527 4.0851 8.1275 0.2665 4.3704 -#> -10.5100 8.6654 3.9859 -11.6839 -6.7456 -1.7912 4.1879 6.8301 -#> -8.2875 -7.6335 -5.0641 3.4509 -0.7044 5.7884 -1.5903 5.4270 -#> 14.7860 1.6551 6.0810 -0.1190 -0.6695 -3.8393 -2.5997 -2.9030 -#> 3.0867 -8.2454 -2.4351 3.7962 -2.4292 0.2244 -0.2360 6.7647 -#> 2.3410 2.5302 -3.1083 -6.5420 -13.2832 6.7880 2.6604 2.6548 -#> -0.9989 -4.2706 7.0629 -3.4784 6.5034 -6.2423 9.9963 -14.3233 -#> 4.2896 -4.0390 -1.4066 -14.4308 -2.8039 9.3129 -0.1018 2.7691 -#> -2.0399 6.2666 2.7508 -5.6485 -1.2999 4.8603 2.6266 -1.8459 -#> -3.3496 -4.6115 1.7148 -2.4124 11.2194 -6.3833 11.5255 -9.6916 -#> 3.6975 -7.3336 -0.8093 -4.4650 -9.5633 5.6365 0.4674 -2.7346 -#> -#> (12,.,.) 
= -#> Columns 1 to 8 -1.6277 1.2656 -1.4825 11.3328 10.6316 3.1802 2.7966 7.3602 -#> 0.7563 11.4658 3.0252 -5.1515 -9.5619 -2.4433 -5.7716 3.6334 -#> -10.5882 12.0208 10.2176 0.2408 -6.5390 -8.0264 -0.8759 -1.1588 -#> -0.7622 4.3897 -1.1896 -1.7668 -8.9001 1.3332 -7.5461 8.9010 -#> -3.4753 -0.1087 8.2465 -6.7310 -5.2670 -7.9490 10.5989 5.9618 -#> -5.2578 -9.5628 3.8548 -2.9971 -7.3184 -3.6002 -3.0816 -13.1817 -#> -1.4256 1.5171 -0.5876 -11.1181 -4.1151 -5.4291 6.7526 8.3762 -#> 11.6134 -3.6160 3.2140 -0.5094 0.0399 -2.3758 -3.6816 5.9869 -#> -3.0449 -2.3651 -3.8286 4.9898 -7.6807 3.3386 0.9428 -0.6329 -#> -8.3678 1.8662 0.6236 -3.0013 -9.7194 -8.1617 -1.3421 4.1308 -#> 15.7022 3.1097 -0.6749 -15.2644 -1.2098 -6.0772 6.4032 -5.7096 -#> 1.1139 12.4873 7.5104 -6.8317 3.5047 0.8986 8.7020 5.2628 -#> 10.1008 0.3333 1.9349 -2.2046 -10.1011 -1.9751 -6.3534 0.1331 -#> -6.7504 4.4257 -8.3930 4.4600 -6.5332 1.4475 0.3728 -2.6905 -#> 2.4878 5.3327 -1.4330 -2.6156 -6.1966 0.1642 9.1931 0.4174 -#> -7.9441 -2.8109 -10.7725 1.8952 -18.0593 0.0408 -9.0493 -2.1951 -#> -15.9260 9.6935 2.5494 -2.1085 -0.7622 -8.1082 2.7542 10.5612 -#> -15.6643 -9.0980 9.7987 -4.7460 -1.0128 -5.6632 4.0524 -1.4456 -#> -0.5765 -0.7434 -8.1518 -23.3519 -10.5594 -6.9221 -8.8655 1.9328 -#> -1.7641 8.7819 11.6229 2.5808 6.2782 -1.7949 8.3206 13.0015 -#> 7.5008 1.6181 3.6726 -4.3403 0.8462 -0.1154 -2.3327 4.6810 -#> 14.0324 2.9959 -11.2916 8.4420 -1.3429 10.1206 6.9614 -1.7276 -#> -9.8757 -3.8125 -9.9773 -9.4471 5.8427 -8.9117 -6.0019 -4.9530 -#> -7.5694 13.4387 -14.0810 -13.3506 -0.1395 -1.3370 4.0942 3.1384 -#> -15.3091 1.5466 -7.9623 -5.0435 0.9694 9.4680 9.3944 -0.9201 -#> 10.9647 -3.0660 -0.2466 -3.3298 -10.6204 -13.0028 -1.8922 10.1039 -#> 1.9402 5.1355 11.9231 5.3982 4.5491 2.5113 4.7216 5.9898 -#> 3.4196 5.3555 -3.5012 5.1005 2.0839 0.7908 10.2228 -5.0219 -#> -12.1527 0.5536 -4.1541 5.5112 -1.1533 6.0899 -1.1417 5.2220 -#> -5.5688 -0.8625 -9.4234 -4.4074 -1.7161 -8.4068 -7.6863 -3.9451 -#> 7.5764 6.6458 -1.9726 -7.0190 4.9921 6.4549 1.7532 6.0018 -#> 7.9313 12.2998 -4.0938 -7.3600 -3.4463 5.2630 -3.4504 -4.2298 -#> 5.3973 -4.2705 -3.6850 1.1945 -8.5218 -1.3317 -2.4017 0.3735 -#> -#> Columns 9 to 16 -9.1700 -5.4425 -9.0628 2.4983 4.2815 11.4832 3.8628 -12.8614 -#> -2.9090 -0.2501 -6.0506 14.9251 0.4494 -0.0867 5.6211 1.6150 -#> -8.0287 -5.9114 -0.7751 4.1525 0.1630 7.7034 5.3568 3.9979 -#> -13.7131 -4.1252 9.3736 11.7485 9.8657 -4.1714 1.9415 -9.1430 -#> -0.1387 3.0135 0.7491 -1.8394 0.1614 2.2556 8.3285 -0.2316 -#> -7.1668 2.9922 2.4090 15.7913 -6.2054 1.9636 -3.4500 7.7883 -#> -4.2918 -4.8643 9.0532 -4.6586 1.6853 11.3118 -0.0763 12.0158 -#> -3.6470 2.7414 1.4341 3.2723 -2.6892 -4.6365 0.6876 2.2663 -#> -1.0103 11.8353 2.4095 3.1762 3.4075 2.2852 10.2877 -2.9286 -#> 0.0213 -3.4105 -5.0831 -0.0772 4.8501 1.4105 6.8952 0.7864 -#> 20.3048 2.1310 -2.1970 0.2460 3.9726 -6.4222 -8.8292 -4.2873 -#> -9.2621 -9.3889 -2.6775 3.2402 -10.3275 6.7707 -0.3343 -5.5113 -#> 10.8026 8.6806 0.4268 -7.1872 -6.6650 8.9344 7.0174 -5.5875 -#> 1.4095 2.6411 5.9594 -5.0065 8.6666 0.3373 11.5285 -3.7100 -#> 8.7947 7.9955 5.4419 -9.5228 12.7298 8.9183 5.0018 4.0518 -#> 0.5228 -2.5511 -0.3390 1.5011 7.9529 -9.4144 -2.0756 -8.6872 -#> -7.0986 -6.2085 -1.4724 -1.9024 -6.0415 5.1373 -8.1891 -2.0590 -#> -8.7808 -0.9564 -12.4611 -8.2841 -4.8299 -0.9901 -2.5113 -4.2294 -#> 5.1060 3.8005 10.3105 18.3401 -9.3370 -6.2471 -4.2201 -1.0089 -#> -4.3429 -12.8005 -3.6652 -3.7718 6.2987 10.2018 4.2139 10.7305 -#> -17.5902 -7.8866 12.9877 4.4011 2.7936 -7.6180 
13.9074 1.7445 -#> 4.2564 8.6423 2.3472 -9.2550 8.6073 9.5538 6.6635 -6.1956 -#> -13.6064 -5.4308 6.6853 3.2149 -1.3079 -3.1043 -5.8836 0.8049 -#> -1.6895 4.3386 8.2895 3.9100 -12.1941 5.3036 6.2939 -16.0732 -#> -8.1796 -5.9860 -5.0674 9.5823 -5.6035 5.2585 -8.3461 4.2852 -#> -7.3154 -9.5797 6.5789 5.3402 -3.7686 1.3971 -2.6134 1.4748 -#> 0.2336 9.6401 6.4530 8.9854 9.1470 12.0268 -3.3962 2.6446 -#> 10.4801 -7.9177 8.4949 -1.7847 -3.1570 4.6640 -3.5705 11.5037 -#> -3.3292 -7.2115 -7.2814 5.5860 2.5205 -4.7651 2.1381 11.7788 -#> 1.8813 -1.8615 -7.5800 2.1251 -14.1831 -0.7501 -3.1613 -4.1596 -#> 7.9042 5.9622 5.9364 7.8909 1.2282 4.5489 -10.1794 -10.3223 -#> 1.7342 0.1356 1.8621 8.5985 10.3165 1.0473 -3.2761 5.2590 -#> 2.2492 2.5793 -1.9641 3.2908 -4.4731 -0.2186 0.3898 0.2026 -#> -#> Columns 17 to 24 -4.2233 6.4903 6.4824 -1.4196 4.8213 6.1748 6.0226 2.4244 -#> 3.0449 3.7639 -2.5759 6.9809 -0.7800 1.7511 3.8831 4.2390 -#> 1.0488 -2.6044 -4.1870 0.7029 5.0582 13.9877 5.3120 3.5888 -#> 6.0070 6.5205 -8.5075 13.3583 -0.2409 6.9363 0.4215 1.7057 -#> 4.8480 3.5007 -1.8699 -10.8386 5.1291 -4.9043 -2.1371 -5.0365 -#> 19.0496 -13.6680 -6.3532 9.1346 -4.0195 -3.4128 8.0442 -18.1827 -#> -10.8550 -11.0827 8.8548 -1.0606 -3.7950 6.9387 1.0117 4.5115 -#> -1.6574 -3.8386 1.0533 -3.1069 -7.0465 2.9287 9.3957 2.9272 -#> 4.9568 3.9067 -6.9688 -5.0304 6.6911 -2.0881 1.5644 4.6007 -#> 1.8367 7.9528 0.6834 -2.0801 4.4093 -1.0517 3.4551 12.8767 -#> -2.4516 1.4194 -6.7113 2.9893 11.2839 -8.8115 -12.6606 -2.3365 -#> -5.6027 4.1230 -0.2919 -10.3406 8.2155 1.4383 -3.7078 -15.4252 -#> 6.5895 2.5759 -1.3891 0.8086 -11.1012 9.8928 -10.9215 15.6942 -#> 1.6642 -1.2642 -0.4765 2.7864 -2.2443 6.1590 1.0146 2.5986 -#> 1.0419 1.3659 0.6039 -5.0359 -2.2402 14.3927 9.0557 5.2491 -#> 6.4229 3.5484 -1.2524 -0.4892 -3.4019 3.0809 3.7813 4.4377 -#> -5.1540 0.9143 14.0819 -1.7899 0.6947 1.4108 -3.0455 4.5929 -#> -10.6781 -6.3872 -4.5548 -6.6081 -0.5025 4.2739 5.1464 -5.9072 -#> 8.7503 -5.9570 2.4831 -4.3309 4.5570 6.4887 7.9486 3.6529 -#> -6.3005 4.0479 9.9377 3.0526 -0.2363 -7.0282 -0.0508 4.9989 -#> -6.6149 4.3750 -2.5539 0.0184 -6.1014 -5.4061 6.5405 3.4628 -#> -9.9721 8.5501 -2.9886 9.5830 -8.8987 -0.2068 2.3084 -0.8445 -#> 7.9700 -2.0417 7.5790 -15.0848 2.8634 -5.3935 2.9578 -3.5884 -#> 3.0857 -0.8034 -2.5963 -4.5656 2.6836 -8.8144 0.8256 4.5884 -#> 1.8981 -7.6407 0.2676 2.9430 9.5955 -6.7089 7.3951 -6.9108 -#> 9.6305 4.2658 1.6670 -3.3669 -5.1492 -0.5657 11.6158 5.5527 -#> 1.0008 7.9193 9.3322 -5.0705 4.1512 6.1594 -3.7197 -6.7994 -#> -2.6899 -4.9149 -2.1636 7.0355 -8.7861 -5.2892 -10.5869 -17.6919 -#> 0.6838 0.8199 7.2066 2.8384 4.5133 5.0984 3.4396 3.6857 -#> 4.4714 2.8224 7.5290 -7.9173 3.3588 -8.8742 -11.3356 -7.5563 -#> 8.5606 6.7477 3.0738 -9.4245 6.5681 11.2474 -3.1492 -0.1624 -#> 8.2534 -2.3553 5.9498 1.4656 7.2768 -0.2787 -7.3997 6.9294 -#> 9.2180 6.3995 -3.5352 4.6713 2.0107 1.4465 -1.4998 -13.3744 -#> -#> Columns 25 to 32 1.2006 -21.4737 0.9001 5.9005 6.3590 13.1644 12.8056 -2.2806 -#> 1.8015 1.9923 7.9683 -2.1466 1.5827 1.7011 -0.8395 12.8080 -#> 5.4818 -0.8662 3.4282 3.0912 0.0991 -15.9086 4.2903 9.6336 -#> -4.9352 4.1994 -1.9150 3.4635 -0.2113 -2.8963 2.8749 -2.0883 -#> 9.9119 -2.3160 8.6780 -10.4556 -3.6645 -2.5613 -7.9138 1.2894 -#> 7.8871 -15.3103 -8.5266 -3.0464 4.2798 2.4715 -6.7049 17.0776 -#> 4.4950 -10.8351 3.4407 19.0463 -7.8943 -0.6938 4.3225 -0.0173 -#> 3.2397 -3.7939 -4.6765 8.3715 4.3350 2.6491 5.8642 -4.2448 -#> 3.7390 0.5699 1.5634 1.3752 -8.9437 -1.8978 -5.1261 -3.9542 -#> -4.8704 5.9890 -1.4254 
-10.6813 6.4396 0.1326 11.2238 8.7575 -#> -10.8583 -4.5291 11.5939 -4.8853 -2.0942 -8.0907 -3.6970 4.8505 -#> 10.9864 5.7334 -10.6233 7.9893 -1.8921 -14.8004 7.8390 10.5799 -#> -0.9188 11.9645 0.7103 -1.0071 -5.8439 -12.7488 4.8880 -0.4999 -#> -9.1022 1.4576 4.3436 -4.2208 -4.4709 -6.3411 -0.1393 9.1185 -#> -7.6898 -9.3494 9.6089 -16.1926 -1.2733 9.3045 -3.8841 -5.5536 -#> -0.3512 -3.1724 1.7844 -1.9881 -15.5762 2.9981 -0.3080 -6.5510 -#> -7.1989 -0.4827 -6.9958 6.3886 11.6115 4.9651 14.6168 3.8082 -#> 8.4152 -8.2617 4.9626 -1.7486 -6.7048 4.1693 2.4776 20.8345 -#> -9.3380 -1.1969 -0.6511 2.3570 -4.3794 -1.5527 -4.2422 -17.0752 -#> -13.8299 6.4868 -8.9810 0.4807 13.4931 4.3804 10.0231 1.8395 -#> 4.7416 -5.4565 2.1269 8.8037 2.9015 -2.4739 -10.3819 3.0256 -#> 4.5900 1.1081 7.7875 4.9756 -5.1406 9.5396 0.5284 -7.9706 -#> -10.6671 -1.8440 -5.7507 -9.6134 0.4597 2.5786 -10.8169 6.2005 -#> -2.9426 -5.4963 0.2590 6.3128 4.9737 0.8832 -0.7476 -6.4966 -#> 6.2490 -8.2589 2.1780 -4.8646 -2.6697 14.5472 -7.1675 -2.4236 -#> 6.2239 -4.4174 -3.0557 -0.4300 -0.0547 10.7877 2.4003 -14.0559 -#> -6.4656 -0.9594 -4.6296 10.1206 -4.0693 0.3593 -6.3911 -5.4361 -#> -7.3114 -12.1247 3.1014 10.2935 -7.8968 -12.6844 -6.3857 -8.1799 -#> -5.6425 8.9205 -3.6614 -5.8458 3.4007 3.4536 7.0519 -0.3873 -#> 9.2386 -12.7774 -2.1179 -5.5670 1.9680 -9.7900 10.5361 0.4400 -#> -7.3526 -4.2630 -2.5834 -8.1674 3.0012 9.5988 -1.7162 -3.8316 -#> -8.0502 21.1164 -10.2557 -11.3802 5.5937 -13.6895 4.5989 6.8188 -#> 16.1440 -3.9049 5.8081 -1.8903 -4.4917 0.1119 -8.0698 3.1317 -#> -#> Columns 33 to 40 13.8945 5.1376 11.2892 4.7304 5.2678 -3.4176 -3.9634 6.5021 -#> -8.8483 -1.0424 -7.2540 -1.0065 -9.8846 -5.2708 -4.0247 8.4387 -#> -10.8568 7.6129 8.4755 -3.1647 -4.1767 1.8637 -5.9465 6.6877 -#> -4.7889 2.6010 -11.0272 -0.9510 -7.6316 5.0427 -2.6191 1.1624 -#> -8.7421 -19.1770 -4.6337 -12.8940 -2.2656 1.9619 0.7837 -0.8762 -#> 12.0341 -4.1220 -0.8017 8.1249 3.1926 1.6903 -11.3236 9.4815 -#> -9.4582 -4.6722 1.3877 2.0304 5.5673 7.5023 -1.9270 13.0872 -#> -4.0317 0.9335 8.0035 4.8700 6.0325 -2.9948 -8.9064 -1.1722 -#> 0.2323 8.4465 -1.6160 -7.1798 -0.7423 1.5307 1.2431 6.8303 -#> -7.3312 -0.3445 -5.3169 -14.5565 -6.2619 -10.5918 4.3772 5.1601 -#> 2.6516 -13.7986 -4.7847 -6.2661 10.1320 -5.9332 4.6941 -3.3675 -#> -7.1772 -1.6759 7.9337 0.4910 1.9311 -5.4967 -6.4506 -0.9066 -#> -1.4451 8.3609 -1.1478 -4.2416 -2.5835 3.1141 4.8414 -4.9543 -#> 2.4346 2.4627 6.9711 2.8532 -1.3404 18.3892 5.8931 1.3370 -#> -0.6633 -8.7064 0.6492 -5.9467 -6.5899 10.7705 14.2726 -1.0755 -#> -8.1477 -5.0272 -2.3789 -7.4334 -8.4012 4.5336 -5.6346 1.4006 -#> -2.9836 4.9160 -1.7135 2.6437 -10.8257 -3.6766 5.8261 1.5997 -#> 1.9477 1.5791 0.3320 -3.6651 -1.7538 -9.1123 -4.8502 5.0894 -#> -11.1569 -1.6841 -2.3055 -0.9117 -7.9841 -3.5817 14.3205 12.3716 -#> -10.0935 -2.0168 -5.7619 7.2712 13.5324 6.1849 6.3196 -13.0108 -#> -10.8633 2.1813 13.8934 7.0051 9.0097 -9.0596 -5.2719 4.5808 -#> -3.4282 5.9018 -7.4001 -6.6291 -0.2669 13.3171 4.1376 -2.6469 -#> 6.5581 -1.5709 9.2863 6.6821 -1.6363 8.3377 6.7423 -1.2998 -#> 2.4276 3.2596 -1.8186 2.4292 -10.6535 -12.4743 0.6836 12.4658 -#> -2.0732 -0.9243 -3.3457 5.6381 0.9751 2.0973 -2.0401 7.6741 -#> -3.1224 -5.8254 2.7578 5.8531 -4.7348 2.8611 -9.0410 -2.6366 -#> 2.3521 -2.6690 1.4861 8.4215 12.6360 9.4422 4.0272 -4.4723 -#> -4.9759 -4.6801 3.2078 6.0508 -0.4729 12.5741 7.1624 0.3861 -#> -3.6352 -1.5695 -2.0715 1.1457 -3.7931 1.3690 1.7022 -1.2421 -#> 3.2337 3.5467 6.6668 -9.6162 -6.5442 1.8203 8.3872 8.9253 -#> -6.3699 -3.4739 
0.2831 -2.6326 -3.4428 -0.5782 3.6256 -2.2742 -#> -2.2679 -5.6235 -3.2640 5.4520 -7.1208 5.1629 -5.4996 -18.7610 -#> -4.0078 -3.6185 1.1123 -10.8744 -4.2643 10.0603 -11.6839 -1.0615 -#> -#> Columns 41 to 48 -5.8828 2.6631 -3.0110 -6.7809 0.0293 -9.6178 -4.4755 -2.6119 -#> 4.1385 -0.2211 -5.7847 -0.0816 4.6601 -1.8591 1.6024 -4.3605 -#> 4.1862 1.2995 -10.7519 -5.6809 1.6123 -1.3902 2.0055 -6.9491 -#> 7.3520 -4.4029 -4.7040 13.0503 -17.7398 -1.9633 4.4114 3.0867 -#> -6.6644 3.7429 8.4155 -15.3608 9.3341 0.3042 6.8965 -2.5737 -#> 3.2898 6.2717 12.4690 8.1115 -9.0811 1.5432 2.7472 -3.1159 -#> -3.8740 -6.8035 9.4345 -3.9589 0.1824 4.4107 5.2822 4.0166 -#> 2.5250 -2.8588 4.2052 -0.8799 -2.6479 14.9133 -5.0021 -3.4389 -#> 3.1106 -1.1804 -7.3940 4.4292 -1.1075 -1.8306 8.6412 3.1824 -#> 5.9995 8.5145 9.8750 -12.1270 4.5127 2.0419 2.5088 5.1208 -#> 2.3775 1.5137 -4.6429 -16.8772 17.8410 8.7111 -9.2237 4.7237 -#> -6.3879 -10.6307 -10.1143 -0.3322 7.1997 4.7498 -13.2150 -12.5384 -#> -1.1561 5.3692 5.9235 14.8841 -11.4622 7.4580 -0.6476 7.9789 -#> -2.6277 0.0278 4.4945 5.5632 -3.9244 -11.6345 13.3383 -2.1119 -#> -13.9300 -0.8732 8.3611 -12.4894 -7.5368 4.1963 4.3556 -8.3775 -#> 2.0822 12.0235 5.5337 -4.5093 -0.8370 -1.8867 12.7920 -2.3058 -#> 5.4194 3.3481 8.3929 -4.1561 -2.4016 -0.4821 -2.4102 5.9310 -#> -2.7429 2.2809 13.0835 -13.3393 -4.6388 -2.4268 -4.5999 -6.9089 -#> 11.8154 -0.0086 -8.7581 -1.8602 6.0035 7.5693 -1.2043 9.0918 -#> 3.5296 -8.8532 -8.3397 -6.5962 -0.2161 4.7971 -1.8156 -0.7327 -#> 3.9167 -5.9942 -4.0317 -1.1674 15.1550 -5.2999 -0.7470 -2.4903 -#> -13.4262 -8.0015 -4.5641 11.9610 -6.9310 -14.0014 5.2533 7.4856 -#> -0.4048 10.0374 12.3274 -7.3601 13.5395 -12.6968 -3.3714 -0.7631 -#> 8.6631 5.0238 5.7300 13.5159 5.2352 -13.0944 8.8396 3.8043 -#> -2.9619 -11.5246 -16.4590 -2.5229 -0.7031 -3.5418 11.5896 -11.8498 -#> 6.3225 4.4412 10.7698 5.8504 -2.5050 5.8819 -1.0052 -3.7116 -#> -9.8836 -0.2882 -1.3196 -10.6948 -0.0785 9.0144 11.0455 5.9665 -#> -0.6816 -3.5955 4.0003 -4.2055 6.3928 3.0317 0.4626 -1.8370 -#> 1.2728 -12.7381 -11.3945 -3.2397 9.7050 -1.7765 6.0116 -12.0875 -#> 9.3059 15.8616 -3.6912 11.0371 6.4374 -8.0325 -1.6756 7.2352 -#> -1.4773 9.8128 -8.6768 -7.0641 -0.5934 10.5564 7.9960 4.2665 -#> 1.8995 6.3863 -1.6942 4.8941 0.6913 12.0428 -9.7984 -1.0175 -#> -2.9466 4.8298 -9.6127 13.0258 4.2701 -3.4983 3.2971 -9.5837 -#> -#> (13,.,.) 
= -#> Columns 1 to 8 -13.7181 -0.3447 5.8356 1.5973 6.2881 7.5921 -5.4251 7.5284 -#> 6.1287 1.0172 4.8493 -11.8992 0.0052 0.0865 8.0248 -0.1148 -#> -2.7358 6.5331 9.9707 -0.5685 -7.6053 -2.8306 4.6883 -3.0714 -#> 10.5341 0.0862 2.1901 0.7360 5.3057 -1.7246 10.2474 -0.4840 -#> -5.9171 -1.9733 -0.1388 -11.1198 6.1604 1.0356 0.4460 0.5814 -#> -1.5607 -0.5406 7.0331 7.1030 5.0921 4.7854 3.2406 -1.8484 -#> -5.0085 2.8199 4.1907 -1.4205 -1.0750 11.5682 3.4545 10.3755 -#> 3.5391 0.1704 0.2633 0.0874 -16.5464 4.8874 3.8234 0.3897 -#> 2.8643 -0.8036 10.4355 0.0936 3.7378 -3.9538 3.5668 -6.0208 -#> -0.3664 0.5333 2.8271 -6.5794 7.8843 2.2596 12.7841 1.2995 -#> 1.6696 17.1502 -12.6774 -21.5333 -0.3590 -2.8230 0.7653 -5.5555 -#> 0.3097 -3.6061 -5.9365 -14.6621 -9.4706 -1.2509 -8.0828 16.4437 -#> -0.9067 -5.4808 -5.9482 3.4010 -11.3237 -5.6148 0.5647 2.0937 -#> -2.1402 -2.1574 1.6820 1.7839 9.6963 -5.1659 10.5187 -17.2321 -#> -12.0609 8.6689 7.5419 -5.2172 -1.5569 1.8496 9.3615 -19.1509 -#> -7.8590 -8.8998 3.7620 -0.8961 1.9522 -4.9404 13.7316 -8.6320 -#> -1.3495 1.6034 4.7689 7.2803 3.5203 12.2188 -7.4717 19.8181 -#> -5.8713 0.0045 16.6472 -7.4291 2.1585 1.1381 -5.0502 -6.4395 -#> -8.0780 10.4833 -7.7087 -4.3378 -6.3973 7.0898 -0.6432 -5.0435 -#> 1.3963 3.7577 -13.2720 2.1819 -3.5963 3.7020 4.4230 11.6741 -#> 15.3527 -7.6608 6.6919 -3.0026 5.1196 -7.1289 2.5166 -8.8008 -#> 12.4373 -10.7005 -10.4661 -0.0378 5.0841 -9.1134 6.2372 -1.9012 -#> -3.8107 -2.2387 -2.3170 6.6957 7.6508 11.8243 -6.0962 -4.2987 -#> -6.0589 -6.3465 -3.5900 8.8230 10.4383 0.5384 -1.9179 11.7249 -#> -8.0175 8.1418 4.3234 -0.7016 -1.8186 4.3396 1.3985 3.0427 -#> -9.0953 0.0205 -16.0239 7.2141 -6.1833 5.6397 12.4842 -7.2513 -#> 1.8304 -2.2553 3.7247 4.4194 -2.4612 5.6431 -11.8009 13.6524 -#> -1.0657 13.1455 -6.4824 2.7998 -2.0011 2.4593 -0.1380 -18.9450 -#> 2.1227 1.4994 -2.0051 1.0570 -2.4858 9.9079 11.9125 -8.3056 -#> -18.7767 7.6160 2.8001 13.6778 9.9759 11.7203 -7.3760 14.7773 -#> -9.2775 3.5073 5.8065 -3.2242 -0.1886 1.7389 10.1169 5.1351 -#> 2.8881 1.9169 -14.5476 0.0994 -0.6918 1.7538 6.6907 -2.9264 -#> -1.5610 -15.5603 -9.3431 -5.6707 -2.7868 -0.6588 0.4745 7.5594 -#> -#> Columns 9 to 16 -0.4537 13.3681 -18.0402 5.3283 7.8392 -6.5154 2.1867 -0.0730 -#> 0.4234 -2.8830 8.3738 5.2371 -5.4255 -9.4456 7.8477 -3.3967 -#> 0.8610 -4.0427 7.0516 8.8728 -8.9026 -6.1186 0.6182 -0.7321 -#> -12.4426 18.5727 2.2000 -3.9517 1.9837 -4.6114 12.5401 -3.7230 -#> 11.1755 -13.6765 2.2766 7.0844 -5.9924 2.1159 -2.8463 -9.8425 -#> 3.4147 -0.2241 1.2905 2.3795 -0.3784 1.7020 -1.0787 2.1012 -#> -12.1654 3.6320 0.2117 13.0024 -15.6153 0.4192 0.7902 11.0978 -#> -3.3314 11.4002 1.2202 11.2767 -9.8292 1.9364 -3.5906 0.8670 -#> 1.3514 -4.4492 -4.2492 -10.7704 8.9424 -7.2758 0.2141 -8.3028 -#> 3.5126 -18.3866 -2.6702 9.4522 -6.4155 -10.0933 14.7170 -5.0721 -#> 7.3894 -9.3259 19.8691 -1.6937 -5.9989 5.7458 -7.6616 4.0485 -#> -2.3914 -0.6550 -3.2121 10.7993 -6.6451 -0.9181 -8.1194 8.8540 -#> -5.0520 11.2931 4.6047 2.6079 -2.1932 1.5243 0.4934 -4.4654 -#> 3.8567 13.9962 9.6761 -5.5595 0.2366 -1.1271 9.6884 -5.6431 -#> 4.9715 10.0399 18.6660 6.2897 -8.3365 -3.5119 1.8689 -19.4082 -#> 1.4753 -2.3583 1.3196 4.5634 5.9071 -7.1842 11.7466 3.7223 -#> -1.7783 -7.2138 -12.7813 -4.8437 -0.8411 -10.0547 -6.0221 -7.1953 -#> 3.2388 -6.6003 -9.2452 5.9463 -6.5550 6.6169 -1.9871 -0.7962 -#> 0.6546 8.8318 12.1956 20.1071 -6.5240 8.5515 -3.0849 4.9281 -#> 2.2310 -1.9946 14.7048 2.0993 -13.3510 1.7606 4.6692 0.4065 -#> -7.1707 0.6846 3.2662 -1.9376 -3.8787 -3.6412 4.8520 
2.4025 -#> 2.3054 7.0808 10.2049 -20.7992 -2.9452 -0.6283 9.2520 1.4946 -#> 9.4116 -2.0544 11.1145 8.0121 -7.4675 15.5080 -6.3952 9.1342 -#> -4.9404 -15.8663 -9.9113 0.3530 3.1372 -6.2037 3.0748 9.5932 -#> 4.7657 -7.6916 -1.1407 2.0595 8.4072 -2.1366 2.2217 1.0098 -#> 4.1171 3.3204 5.5477 16.9107 -11.5887 2.8421 1.1560 -8.0653 -#> 6.5559 9.8319 0.0996 -2.4849 -0.1135 10.7154 -12.6500 -6.1913 -#> -1.2950 -24.7869 11.4024 2.9801 -18.1488 -3.0154 3.3089 2.1303 -#> 1.2271 3.9828 -0.3498 6.1050 -2.0436 -0.7848 4.1063 -10.9277 -#> -10.3081 -13.2193 -5.8505 -0.1730 4.0578 2.0895 -1.7765 10.1179 -#> 14.2637 11.8619 -5.1396 0.1210 -2.3210 0.0329 -7.7430 -10.3766 -#> -6.8249 6.6138 12.0710 -1.9869 5.8455 -0.9538 2.8660 5.0700 -#> 1.6378 4.3483 8.2711 -2.6529 7.7409 -0.4693 -0.2296 3.8611 -#> -#> Columns 17 to 24 -18.2529 -0.7843 11.8839 -4.1371 -7.5558 7.9947 11.7894 2.7651 -#> 12.3867 4.1103 -4.3933 -6.1404 11.4483 -3.2177 -6.0363 -4.0282 -#> 13.4974 2.2192 5.8247 3.9458 6.4274 4.3778 -9.2454 -6.9374 -#> 10.8095 -0.0647 6.7780 -4.2264 10.1764 -9.3652 8.2440 5.3695 -#> 1.5547 1.4251 -10.5475 3.5811 0.9859 4.3994 2.8293 -1.6326 -#> 7.2752 -7.2462 -9.2140 1.8691 -10.8841 9.8740 4.3779 -14.9532 -#> -10.0497 -4.1462 5.6859 -1.9422 -7.2311 -2.2039 -6.7706 -7.7515 -#> -7.2611 0.3434 10.8226 -2.5999 -11.7364 -1.3330 -11.1642 -6.6722 -#> -0.5093 -2.7057 1.1337 5.9100 -0.0395 8.2051 8.4493 2.2405 -#> 3.0575 8.8751 -8.7246 -5.4412 6.1884 3.0971 -1.2062 15.2766 -#> -9.0075 6.7520 -11.1594 -3.4083 11.5826 10.3835 -13.9912 5.7969 -#> 11.1581 3.9730 1.6344 9.1667 -8.0174 -3.3163 1.8599 -14.1742 -#> 8.8165 -10.4774 5.9238 -7.9778 15.0429 -2.0832 -8.3252 -0.3596 -#> 7.0435 -16.1194 0.4444 3.5330 2.8708 7.6799 -3.6777 4.3081 -#> -11.0742 -13.8376 2.3848 -3.8527 10.4867 1.3087 -13.1818 8.7516 -#> 9.0784 2.0644 4.0886 4.4662 0.3837 4.4890 3.2950 7.3878 -#> -3.0825 2.8900 1.5499 -1.8036 -6.2011 -5.9667 3.2099 18.0180 -#> -6.6086 -0.4012 -0.6056 -1.2367 -2.0511 6.0924 -0.1952 -1.4324 -#> -5.3393 5.1967 -0.9222 -0.1756 -2.3408 2.1642 -3.2054 3.4058 -#> -2.7763 -1.9330 5.2968 6.1661 11.7368 -1.3482 -13.2555 10.1649 -#> 7.7419 -1.5774 6.6621 4.8569 -2.8487 -1.1811 -1.2904 -3.5283 -#> -2.1136 -3.8623 10.4048 5.8220 10.9268 -1.0691 -0.8613 6.4316 -#> -5.7798 1.2923 -15.6228 1.4064 -8.7560 -1.9435 0.4758 -2.0679 -#> 9.5343 4.2179 -6.6049 -5.6621 -1.8436 2.4079 3.7554 -6.3271 -#> 4.9440 5.6170 -2.8385 6.9373 -4.0377 1.8346 3.6950 0.9082 -#> -9.6127 7.4395 3.0402 -2.2246 -7.6659 -1.1280 -1.8699 9.9940 -#> -12.7051 4.4045 -3.1659 -2.2974 -1.9191 -4.9103 1.0011 -13.5765 -#> 1.8682 -9.5825 -10.4569 -11.7557 4.8390 6.9839 -4.2444 9.4414 -#> 6.5605 1.6008 -2.1709 3.3722 1.2255 -0.9195 0.8182 14.2683 -#> -5.5613 -12.4914 -7.0133 -0.1660 -9.0405 12.4403 5.1551 7.0832 -#> -7.8657 -4.8879 2.4015 -0.1641 -8.1200 -4.2953 -0.9813 -2.0122 -#> 13.1090 -1.1549 -9.9288 3.4933 10.4903 -6.2657 4.1032 11.1627 -#> 22.9732 5.6470 3.7499 6.1426 6.4469 4.5686 8.4451 -10.3352 -#> -#> Columns 25 to 32 5.7054 -5.3441 -6.0978 6.0447 -10.4055 8.0234 -0.7850 -5.8711 -#> 4.4482 5.9026 8.5244 -6.5094 -4.0751 6.0330 -6.4294 0.0522 -#> 9.7296 3.1358 8.8016 -8.0296 5.5832 -3.2571 5.4091 4.3074 -#> -5.8150 4.7967 -0.4531 0.9397 -2.1079 3.9001 -4.1233 0.6805 -#> 1.8793 6.4777 -0.0286 10.9774 -0.6168 -2.8688 -0.0493 11.3595 -#> 16.6737 1.7602 -10.4666 -16.6184 8.4079 11.1176 -7.8161 12.0327 -#> 13.1589 -15.4014 3.1226 7.8401 11.9626 -20.9714 6.7551 3.3441 -#> 5.5559 -8.3731 -10.7067 2.6777 -6.6518 6.9165 5.1903 -0.0387 -#> 0.6119 9.3752 1.1777 10.1055 -7.7612 
2.4682 -2.6647 -3.2515 -#> 1.4933 0.8544 6.1785 1.2085 0.6575 -1.3207 0.0227 5.0657 -#> 3.8068 -3.8701 6.7254 6.9575 -7.4438 5.2901 1.0952 -7.0580 -#> 10.7262 7.8740 0.3326 -21.5471 -7.1646 -4.9007 9.6586 9.2646 -#> 0.9556 -0.0027 11.4575 -1.3609 -1.2580 -5.2391 2.3976 8.6294 -#> -0.6346 4.0469 3.8204 -2.8324 -1.9279 -6.9823 0.8844 9.6346 -#> 10.8041 -2.5132 4.4946 3.0518 -0.3110 0.5100 2.4100 -2.7544 -#> 1.7201 -0.0440 -3.2218 2.4878 0.9819 0.4591 -1.7396 4.6635 -#> 3.1439 -8.0099 -2.2478 2.0788 19.5492 -13.2788 -6.1485 10.1090 -#> 17.1915 1.1694 -0.3155 -3.8923 -4.0340 -0.6389 0.5847 10.7342 -#> -3.1465 -5.3648 4.2524 -2.8767 4.0003 2.2823 3.7417 -0.6386 -#> -7.0934 -3.8575 1.9482 8.5610 -4.5604 -11.2780 8.2424 -1.5578 -#> 1.1857 -1.3874 1.4302 2.4410 -17.7996 -2.9959 -4.0117 -7.0299 -#> -4.5234 -1.2063 1.6614 6.9609 -5.7689 -0.8851 -1.3269 -8.5309 -#> -1.9256 3.0458 -3.4664 -10.7231 5.4695 -2.0249 1.7123 5.4240 -#> -15.2104 4.3967 -0.5276 2.4381 -4.1284 6.0441 3.7916 -6.4473 -#> 7.1233 0.2088 0.9726 -8.5699 -3.5048 -4.0555 -10.4901 -1.2846 -#> 1.7306 0.8744 -4.3449 13.0051 -0.1197 2.0478 -3.5184 5.3571 -#> -3.3896 -4.1441 -3.2758 -1.9248 3.6815 -1.8899 4.8495 0.0962 -#> -0.2889 -3.1917 0.6236 -3.3692 14.2575 -8.5865 -10.0056 -3.6139 -#> -3.2592 8.1670 2.5470 -6.6863 -1.6065 -5.4853 -9.3110 -1.9122 -#> 6.8977 -4.4494 -8.8062 11.3942 14.2317 -5.9056 -6.1580 0.3269 -#> 1.0336 -5.3697 -12.3649 -15.0611 5.9503 13.7932 10.8067 4.7749 -#> -2.3155 12.2920 2.8133 -11.2705 -3.1521 -5.7135 4.6695 -6.1455 -#> 2.0441 -0.5978 2.9946 2.2039 1.6005 0.5476 -10.1008 5.1450 -#> -#> Columns 33 to 40 13.9888 1.6621 -6.2028 -5.8881 -15.0765 -7.4038 -6.6962 -5.2960 -#> -2.1251 -0.9159 0.2313 6.4890 5.8609 1.1657 -1.4320 1.8357 -#> -13.0969 -3.3009 4.5870 0.1656 6.5823 4.0142 0.2058 5.6121 -#> 4.0445 -7.3677 -3.3419 6.4198 -2.8480 5.5827 13.7324 2.4249 -#> -18.3887 -10.2412 13.6661 7.2739 0.7158 3.9746 -4.5782 9.0107 -#> 1.4559 -11.6066 -6.2290 -2.9388 -5.0435 -1.1411 -1.3263 -4.6638 -#> -9.8511 -7.0693 11.5878 9.9078 -6.7654 -3.8763 0.2616 1.9187 -#> 6.3136 -5.5531 -0.5147 2.4884 5.5770 0.8150 -1.4526 -5.0775 -#> -13.1313 3.6136 -6.0809 1.1632 6.8753 4.5192 2.9325 -4.9329 -#> -16.4135 -5.5086 5.1798 0.9344 4.7721 -5.9660 -6.8268 12.8024 -#> -7.5546 8.2615 3.2973 -7.1439 6.9062 -3.2143 3.7799 -3.6713 -#> -10.0234 -4.3302 6.5380 -3.5359 -0.6433 -12.9074 0.1519 6.9613 -#> 1.6368 -3.8118 7.6408 -0.7032 9.1176 7.9364 -6.1997 0.0139 -#> -11.1897 8.9903 6.2937 -8.6550 0.1846 3.1229 7.4796 -9.4778 -#> -13.8819 -10.0532 17.6174 -8.0613 -1.6298 9.8393 -9.3358 -9.0107 -#> -1.8548 1.0385 -0.4659 6.4678 10.0948 -3.8695 14.7695 8.0527 -#> -8.6322 -9.9610 -2.1486 5.3907 -7.8803 -8.2907 -17.0719 9.2418 -#> -5.7645 -4.1250 -0.8757 -5.3394 -1.9818 0.1242 1.4888 1.3666 -#> 4.2515 -3.5361 3.5730 9.5375 8.4103 2.8748 11.2373 2.8970 -#> -7.1270 7.0362 6.7197 0.2017 -9.1090 -3.3480 -11.7893 -5.8566 -#> -0.3841 5.4366 -0.9445 -2.6105 -5.0244 3.3680 13.2543 -1.5195 -#> 1.0747 14.1734 1.1987 -1.7408 4.8632 1.5659 5.5709 -19.8471 -#> 1.6983 -1.2366 4.8798 -2.5299 -9.1657 0.9159 2.1604 -8.2412 -#> -1.0699 -3.8091 -0.6495 0.3911 -17.1536 -1.6623 1.0929 8.2718 -#> -1.9780 0.0814 0.5713 5.0296 -6.1497 2.7823 4.6570 2.8901 -#> 9.9274 -15.5823 11.1468 13.4251 -6.8414 -1.5312 -4.1873 -3.8815 -#> -3.2455 -0.2084 -6.4184 2.6211 -4.3341 -1.4415 -4.0072 -10.8088 -#> -5.4387 -5.7921 0.6411 -5.3342 -2.9096 -17.0201 8.5314 9.1331 -#> -9.3609 2.5416 -0.6532 0.8389 3.4634 -5.3714 1.6504 6.9934 -#> 1.2955 -0.9298 0.3603 -5.5105 -4.1873 -4.4538 2.8055 
14.2132 -#> -10.8144 -7.4770 -1.9756 -0.7660 0.2557 5.3072 3.9560 -1.3456 -#> -12.0214 2.5167 8.1731 -1.4205 -1.2927 8.0969 -3.6897 16.6704 -#> 1.7988 0.0107 8.2771 4.0196 2.4219 -0.4908 1.5162 0.4107 -#> -#> Columns 41 to 48 6.8223 -1.1953 -16.5700 -0.2202 -1.8059 -4.1714 2.2586 3.1392 -#> -9.7408 -4.4622 9.2715 9.7497 -9.3702 -6.9846 1.4185 0.2367 -#> -1.3460 -4.0343 6.5043 8.9333 -2.8347 -0.7131 -4.1512 -1.6062 -#> 0.1806 3.0887 -2.6297 -1.5396 3.7892 -3.0346 -1.3348 9.8749 -#> -6.5430 1.0502 4.3606 7.9445 2.2091 -2.4708 -2.1457 -11.0325 -#> -0.3296 -1.0538 19.1806 -0.6681 7.7155 1.0968 14.9470 9.6190 -#> -13.6597 1.3308 4.1055 18.2011 5.4537 1.0127 -9.1060 -0.3762 -#> -0.2162 -1.0438 5.0688 -8.3406 2.6659 -4.0165 -0.4390 -0.7727 -#> 0.0441 -5.9841 -4.3727 12.3835 -6.2386 11.5710 -3.6306 8.3481 -#> -1.1854 2.9303 -2.0192 6.8946 -1.6485 -7.4601 -0.4546 -1.8196 -#> -12.3307 -2.6553 18.9656 -0.5494 0.6465 -8.9811 -3.0752 -17.2511 -#> -5.0318 0.9506 13.3649 -5.5659 -9.9188 -11.2742 -10.3484 -13.8310 -#> -7.5226 8.9456 -1.2620 -12.0812 6.8642 5.7276 -10.8331 1.1360 -#> -10.2075 10.5311 7.0963 -2.4166 -4.3044 9.0821 -1.8509 4.5489 -#> -10.6392 2.3903 7.6245 -5.2061 1.1225 0.5899 0.8800 -0.8110 -#> 14.0755 5.1738 -7.7262 3.4819 -10.6563 11.5738 -15.0048 10.6674 -#> 4.9958 8.7628 -13.1731 -2.1885 5.2119 -3.2639 -12.4037 3.4309 -#> 6.2419 10.6203 11.5762 3.5668 -7.6028 -9.1614 4.8107 -7.8572 -#> -0.7675 -12.4405 2.2323 -6.8869 3.5631 -2.5244 -6.3250 8.6155 -#> -4.1113 2.0419 1.0660 -4.4972 4.0218 -6.5370 -6.7348 -5.9009 -#> -0.9550 8.0446 7.0813 0.4395 -12.6257 -4.2030 3.9091 -0.8361 -#> -0.6885 7.8596 -12.6022 -7.9810 -2.3373 11.8033 1.6770 -1.6105 -#> -6.5064 4.1790 4.0999 1.1366 7.9675 11.6719 3.3407 6.3710 -#> -1.5531 -11.0727 -1.7039 10.3451 7.2884 6.7369 0.9745 10.8481 -#> 17.5924 -8.7264 11.3075 2.8726 -14.5478 -11.4686 -3.0124 10.2257 -#> 7.9230 -6.6112 1.0433 -3.9745 6.6238 -8.8205 -12.9009 0.8202 -#> -4.9997 -3.1634 -10.4045 6.3738 4.9820 12.5160 -2.0337 -2.6327 -#> -5.0487 -8.2076 19.7135 2.4629 -0.8365 2.6996 -0.6732 -1.3441 -#> 7.2947 -0.0056 5.3458 -0.1922 -20.1747 -14.5637 -0.0136 -0.7484 -#> 5.6158 -6.5718 1.9386 -6.6233 14.2731 6.9216 -3.8060 3.3327 -#> 4.9116 -5.7868 -13.6616 -10.8264 3.4249 4.6447 6.1558 2.3811 -#> -0.3448 12.0279 -1.1617 -2.9150 -6.2155 -14.8863 -4.9204 10.1638 -#> 2.2497 -1.4470 0.6458 -3.5691 -0.2670 -2.3919 -8.6552 -3.1209 -#> -#> (14,.,.) 
= -#> Columns 1 to 8 -2.1444 0.6275 -5.3745 7.5895 16.7731 -13.2129 -10.5422 9.0688 -#> -8.6767 -10.2206 9.7955 6.8703 2.7551 -14.1020 -4.8347 -11.2143 -#> -7.6052 -18.8394 2.9669 5.0451 4.2223 -1.5257 7.6646 -7.3693 -#> -10.0294 -4.8756 15.5625 8.1806 3.2329 -9.9985 2.3995 2.0965 -#> 4.9436 1.4775 -3.2586 -7.1490 -10.9452 7.2870 3.4860 0.2213 -#> 6.0503 -10.1194 16.5172 0.6114 -16.4727 1.2303 4.3059 -0.2589 -#> -1.6701 3.0303 2.6328 1.9484 -4.5294 1.2428 -4.5693 -9.4682 -#> 5.1566 -4.0968 -5.4200 5.1647 -1.7696 -1.5406 -3.8415 -1.7984 -#> 3.6765 3.4669 -5.7077 0.1770 -7.3732 -7.5296 -0.5793 -2.3397 -#> -5.7430 -8.4634 -5.5900 1.6437 6.9694 -11.7347 -6.2177 -4.6205 -#> 0.6710 -4.0882 9.8013 -1.7805 1.2103 15.6890 -3.8652 -8.9470 -#> 12.5851 4.4384 10.4783 1.7717 1.1894 14.1159 -11.6373 0.3138 -#> 0.3266 -3.3944 -12.8845 -4.9403 -1.4425 -2.2774 4.4732 -4.2001 -#> 7.7831 -0.6257 12.6476 -4.7705 -1.3084 3.2592 -4.7172 3.3935 -#> 0.2951 -13.8237 -0.6221 -3.9925 -6.7304 4.5764 -0.5526 -9.1893 -#> -3.1852 -3.4848 -5.6748 -3.6255 12.7578 2.8069 1.2936 9.8560 -#> -8.7286 10.1477 3.0263 -0.5387 2.2513 -3.9209 -9.0792 -0.1861 -#> 8.2871 -9.3489 6.1300 5.8867 0.3383 16.0642 13.6681 5.7864 -#> -2.4832 -11.1671 4.5390 16.8200 1.3044 -0.3483 -5.8973 -11.7801 -#> 1.2391 -1.0048 2.2145 11.8020 -6.3125 -1.3823 -4.2399 -10.7427 -#> 6.6959 -6.8043 7.1455 3.5484 -2.3540 -6.0205 -8.9290 -1.8643 -#> -0.3061 12.4510 10.5427 -7.4929 0.3782 -12.1543 -9.1551 6.2211 -#> 9.7317 4.5159 10.3419 -2.1879 -2.6870 7.7015 -1.9451 4.4211 -#> 1.7871 -0.0642 -10.1845 0.0613 -3.8851 -5.8144 -6.2688 1.7174 -#> 13.4599 -2.8482 8.3512 16.3706 3.3594 5.3914 1.4781 -11.9550 -#> -4.2157 -11.1060 -12.5199 8.8629 -1.2847 -3.2273 -1.1830 -2.4803 -#> 2.6273 15.6500 5.7461 -16.2158 -15.0618 -6.3968 -4.2644 -2.5628 -#> -4.7286 5.6862 14.9851 -0.5636 -7.2220 -2.7243 -2.4637 2.2551 -#> -3.3462 -7.6389 1.9538 5.6908 4.4791 -4.6541 -5.4730 -10.1804 -#> 2.2027 2.9326 -3.4673 -5.2041 11.2478 4.3450 -5.5164 13.3213 -#> 13.4535 6.0531 0.5279 -0.7927 -3.0388 -6.7141 -5.2771 0.3138 -#> -9.7670 6.7562 -5.5766 -8.8379 5.7746 2.1539 0.0484 -15.2464 -#> 0.5020 -6.4378 -6.9533 -6.7794 6.2663 -0.8423 -2.0734 5.1950 -#> -#> Columns 9 to 16 -0.9762 3.0234 1.0104 7.0104 6.8566 -10.5834 16.1784 -4.9472 -#> -3.3504 -8.0560 1.4816 4.3687 -7.1903 3.3486 1.2851 -8.6248 -#> -5.5563 -8.5923 5.3647 3.1766 -4.4003 -2.6062 -1.3784 -10.4659 -#> -10.4741 -5.5723 -0.6460 13.2420 3.4350 -11.2556 0.8239 -15.1732 -#> -4.9091 -10.6420 12.9322 -3.6824 -3.0656 1.6655 -5.3298 4.0569 -#> -13.2547 6.9211 6.5194 -13.3752 -5.8592 6.8385 -10.8662 6.6610 -#> -6.7226 -2.5321 0.3613 3.7425 0.5684 -2.5713 -4.4943 10.9557 -#> -3.2104 1.4335 -3.7294 3.8372 -5.0969 5.9416 8.7966 -0.9288 -#> -7.9040 5.0802 1.7175 9.3299 -16.1795 -0.6950 -4.9105 -7.8795 -#> -1.7059 -23.6254 8.1958 13.6085 -8.8340 5.5690 3.1202 -6.7429 -#> 6.1610 0.7591 -0.4275 -16.4001 -3.7464 14.8188 2.7395 -6.1317 -#> -4.6173 -10.0567 8.0694 -2.3891 -5.2332 2.1577 -2.2042 5.9672 -#> 5.1777 -7.5451 -6.8466 -0.0412 4.9360 5.7510 4.0687 -7.6527 -#> -3.6797 -2.3772 2.3120 -1.6865 -4.3676 8.5093 -5.7215 -7.1319 -#> 0.6331 -0.7750 5.7840 -3.2438 2.5939 -0.9735 1.9388 -7.4345 -#> -3.6778 -6.0983 -16.4988 10.7501 -8.7596 6.7864 -1.5069 -5.2718 -#> 12.5570 -12.5301 3.2381 10.6822 -0.4238 -1.2423 1.9058 0.9420 -#> -8.1062 -3.3461 1.2314 -5.4680 -1.8716 14.2853 2.3487 4.5719 -#> -7.1079 -11.9893 0.3655 4.2615 4.4804 6.5035 -8.9245 -4.1921 -#> 2.1747 -8.1742 7.0437 1.0222 2.6713 -3.7678 -6.9374 9.9402 -#> -1.1591 -3.6203 -0.1846 
-5.2856 -8.4578 -5.6804 -3.7338 7.4090 -#> -4.8591 16.9304 -8.2518 6.5114 2.1960 -4.1031 -0.9657 -2.2481 -#> 5.8799 -13.4325 1.9374 -7.1436 -4.0939 -1.5348 -4.6633 2.8750 -#> 11.9135 -5.9931 10.6113 -0.2672 7.9714 -6.8968 5.2002 14.0565 -#> -5.4820 -0.7658 0.9432 -0.8519 -2.5855 -5.5044 -12.3018 1.7075 -#> -6.5375 -9.9194 -10.3241 11.1755 3.7264 0.8428 6.9324 4.6708 -#> 1.6639 13.2267 11.1546 1.4382 -10.2988 -11.9187 -4.2637 -0.9742 -#> 7.6500 3.4899 4.6013 -9.5289 -8.5190 1.7298 -7.5066 1.5496 -#> -4.4808 -9.7392 1.8155 3.2473 -9.2740 2.8275 -5.9945 -12.2909 -#> 4.8999 -8.4155 -6.2310 -7.1038 6.0395 -7.5361 8.3684 -1.2237 -#> -0.3172 2.4322 10.8408 4.7848 -1.3948 -2.0308 -3.4644 -3.6158 -#> 1.2032 -0.9063 -0.4967 -4.0826 1.0183 -3.8041 1.0185 -9.7255 -#> -4.6066 0.2925 -11.5063 -3.3608 3.6180 1.6864 2.4220 -4.2550 -#> -#> Columns 17 to 24 3.7132 2.7026 -1.0514 -6.3857 5.8852 9.2122 -7.7985 -10.9549 -#> -3.5542 -10.4813 -10.9183 5.2912 0.7644 -1.3522 5.8959 4.7606 -#> -1.8724 -4.6788 -9.0386 -0.9991 8.7160 1.1418 3.4454 0.1548 -#> 0.3223 -5.1287 -7.9411 2.5696 13.8285 -13.6017 -11.8921 -1.6590 -#> -7.4142 8.6525 -8.4901 11.6494 -4.1533 0.5475 -0.9908 -3.4346 -#> -4.0823 -2.7602 -2.7770 7.8444 -8.3679 -1.6173 -2.7454 -4.3434 -#> -8.7138 -5.5199 5.0951 -9.3132 10.9098 -0.3270 -5.0873 -10.6834 -#> 1.7729 0.1606 3.5235 -3.2069 -1.4126 5.3862 1.6555 6.0412 -#> -2.3556 -3.7612 -5.9476 -0.6006 8.3621 -6.4094 0.4973 8.1502 -#> 5.8877 -8.2092 -14.1709 4.3547 -1.2038 5.5549 6.8928 1.8750 -#> 2.7522 11.6348 -15.4113 14.9461 -7.2504 3.8405 -1.0440 1.2750 -#> -2.4977 6.1348 -1.9506 -13.9133 4.9634 8.0606 10.5353 -9.5552 -#> 3.4657 0.2648 -3.3996 -1.7110 5.0830 -2.7579 0.3588 4.0522 -#> -2.9026 -0.4903 4.1745 1.7057 12.6265 -10.0762 -9.1472 -2.8776 -#> -24.6963 4.9028 2.2497 12.7402 6.2606 0.9909 5.0337 3.1554 -#> 2.6343 -1.5731 -0.3098 1.5872 8.9305 -9.5032 -3.1197 -4.7093 -#> 11.4809 -10.5834 7.0071 -0.9855 0.1514 11.3191 16.7707 -4.3523 -#> -5.0535 12.5095 -3.0928 -3.5276 3.5058 -5.5731 -6.5917 -3.1749 -#> -4.4402 -4.3286 -2.5488 4.6506 -4.4229 -2.9584 -2.2505 0.4066 -#> -1.5600 -3.5898 2.8236 -0.6024 4.6846 4.3762 5.7524 1.0433 -#> -14.8293 -1.8766 0.7403 -6.9910 16.1613 1.9846 -0.6524 2.4959 -#> -5.4296 0.8858 -2.0244 0.4708 10.3446 -7.2269 -11.1847 6.7621 -#> -6.3072 -7.2304 5.8156 6.9135 -6.5860 -9.2921 -0.1708 -7.5872 -#> -1.2483 -13.9717 -4.7192 -0.4497 -5.7567 0.0720 -3.2322 -5.1351 -#> -3.1657 -4.7625 8.0419 -5.4016 -1.2495 -8.2174 9.3439 -3.0667 -#> -1.3187 1.6499 -6.5667 4.9880 1.5150 0.8097 -3.4620 -2.6438 -#> -2.3831 -2.7993 7.8687 7.6837 5.1708 -8.4030 16.2167 2.6972 -#> -1.5868 2.9036 0.2707 3.7706 8.1882 14.5434 5.2464 -5.5609 -#> 5.3385 -10.0304 7.2175 -3.0437 11.3225 2.2313 12.7802 2.7522 -#> 2.7248 4.7965 -6.9295 1.4670 -5.0119 18.2201 -6.2492 -13.9647 -#> -0.5215 6.2616 6.8941 3.0859 5.0107 0.5953 7.2855 2.0261 -#> 4.9730 -10.6661 2.0325 5.6798 -10.4148 6.9674 -0.3137 -1.7415 -#> 1.3191 1.6653 -0.9755 -6.9042 1.9775 1.5980 -2.9887 -7.3644 -#> -#> Columns 25 to 32 -2.2584 -3.1411 -1.7190 5.6135 -0.7605 -1.8872 8.8467 0.3317 -#> 1.8911 2.5071 4.7714 2.8800 -11.7468 10.1048 -1.3142 -1.2634 -#> -0.3020 -6.2638 0.6670 0.2206 1.6318 -0.1777 3.8507 1.0090 -#> 3.9061 11.3403 -0.1531 2.3443 -0.2155 13.3499 -17.7884 5.8076 -#> 4.6891 4.9715 1.8122 -3.1096 15.1079 -8.8061 5.2626 -1.7620 -#> 11.8195 13.2659 -0.9978 -3.7929 10.2876 -14.6448 0.8412 5.7287 -#> 4.9830 -6.6374 0.5644 -1.2754 -0.3460 -2.1989 -15.5581 7.8374 -#> 3.7161 -4.0108 -5.7117 0.2989 -4.1622 -9.0537 -7.2892 12.5500 -#> 
-#> [ ... printed tensor values elided: rendered output continues through slices (15,.,.) to (19,.,.), columns 1 to 48 each ... ]
-#> (20,.,.) 
= -#> Columns 1 to 6 2.3410e+00 9.7388e+00 1.6557e+01 1.1382e+01 3.7641e+00 5.1080e+00 -#> 9.7553e-01 -8.4447e+00 1.3123e+00 -1.5725e+01 -4.9083e+00 4.1044e+00 -#> -3.1703e+00 -2.2135e+00 4.4341e+00 -1.2445e+01 -3.2447e+00 3.6457e+00 -#> 1.2177e+01 1.5420e+00 5.5418e+00 -2.3071e+01 -1.8939e+00 1.7252e+01 -#> -1.0726e+01 -4.2155e+00 6.5425e+00 -2.0108e+00 -4.7632e+00 -1.3778e+00 -#> 1.2218e+01 -3.6186e+00 -4.9115e+00 5.6926e+00 3.1619e+00 -1.5542e+01 -#> 1.1675e+00 1.8411e+00 1.7625e+01 -1.9243e+00 -1.8360e+00 5.9005e+00 -#> -2.1542e+00 3.3940e-01 -4.9044e-01 7.3563e+00 9.6633e+00 -3.6952e+00 -#> 8.6159e-02 -9.9189e+00 -3.8149e+00 -8.9140e+00 2.6082e+00 -6.3848e+00 -#> -1.0125e+01 1.6594e-01 -2.5587e+00 -6.3884e+00 -1.0598e+01 -5.6895e+00 -#> -1.2456e+01 1.3919e+01 8.3685e-01 1.1546e+00 -3.0186e+00 -3.8498e+00 -#> -2.6311e+00 -2.4535e+00 6.6286e+00 -1.7185e+00 -5.9649e+00 -2.6087e+00 -#> -1.0975e+01 -2.2859e+00 -4.9682e+00 1.6251e+00 -8.6998e-01 9.8512e-01 -#> 2.8754e+00 2.6769e+00 2.8849e+00 9.1190e-02 4.8219e+00 5.3024e+00 -#> -1.0107e+01 8.5539e+00 -2.2039e+00 1.1878e+01 3.7640e+00 1.7298e+00 -#> 5.3624e+00 -4.1503e+00 1.0721e+00 -9.7069e+00 -2.6360e+00 6.1107e+00 -#> -4.1990e+00 5.3183e+00 9.7740e+00 -1.4839e-02 -2.9959e+00 -9.8873e+00 -#> 7.1682e-01 -6.7365e+00 1.7742e+01 -2.0493e+00 -4.0481e+00 -2.4705e+00 -#> 5.6811e+00 2.1429e+00 -1.9895e+00 -6.0990e+00 7.1933e+00 1.0245e+01 -#> -1.0327e+01 1.1902e+01 8.1764e+00 2.9050e+00 3.5568e+00 4.8856e+00 -#> 9.8634e+00 -7.3801e+00 -1.8072e+00 -1.8462e+01 -9.1422e+00 8.2301e+00 -#> 7.4528e+00 -3.2333e+00 -2.7164e+00 -2.6745e+00 4.9729e+00 8.2682e+00 -#> 9.8821e+00 6.4739e+00 -2.9487e-01 -9.3617e-01 1.5004e+01 1.2282e+00 -#> -4.8038e-01 7.8481e+00 3.1612e+00 -1.7786e+01 -7.0473e+00 -4.7113e+00 -#> 1.3829e+01 -1.1918e+00 4.7191e+00 -9.9064e+00 3.7553e+00 -5.6204e-01 -#> -2.0430e+00 1.0027e+01 7.2484e+00 7.6063e-01 6.9578e+00 4.1638e+00 -#> -1.2486e+00 -3.1506e+00 7.0924e+00 6.9274e+00 1.1315e+01 -6.7493e-01 -#> 4.5251e+00 -1.1089e+00 -4.6790e+00 -4.9297e+00 -2.0166e+01 -3.0425e+00 -#> 2.0341e+00 -5.0193e+00 -1.5881e+00 1.4271e+00 -5.1175e-01 -1.9088e-01 -#> 4.3895e+00 -3.2604e-01 2.8351e+00 6.5263e+00 -6.4667e+00 -3.6202e+00 -#> -4.6403e-01 1.6272e+00 3.1020e+00 6.5796e+00 1.4924e+01 4.7444e+00 -#> -8.6713e+00 7.4556e+00 -1.9416e+01 -6.8462e+00 -1.1393e+01 -6.3260e+00 -#> 8.3200e+00 -1.1023e+01 3.5368e+00 -1.8314e+00 -7.4983e+00 -1.0943e-02 -#> -#> Columns 7 to 12 -2.4804e+00 -1.8602e-01 7.1954e-01 2.9089e+00 1.7618e+00 -1.3436e+01 -#> 1.7067e+00 8.4794e+00 -2.6234e+00 7.7493e+00 -6.4802e+00 2.0458e+00 -#> 5.1771e+00 3.1818e-01 -1.7405e+00 -6.4634e-01 -7.8719e+00 -2.0841e+00 -#> -8.7515e+00 8.5336e-01 -1.5769e+01 -5.3437e+00 -3.2970e+00 -2.0343e+00 -#> 2.2304e+00 -5.4520e+00 1.9013e+00 2.4374e-01 -2.1490e-01 -1.4346e+01 -#> -2.6801e+00 -1.8525e+01 2.5054e-01 1.1545e+00 1.2923e+00 -2.2221e+00 -#> 5.4859e+00 3.2670e+00 5.4010e+00 1.0159e+00 2.6045e+00 -3.3450e+00 -#> -5.5536e-01 1.9618e+00 3.5680e+00 8.0874e+00 9.1611e+00 -2.3273e+00 -#> -8.0280e+00 1.5038e+00 -2.8943e-01 6.2901e+00 -5.8491e+00 -1.6066e+00 -#> 7.5733e-01 -8.2033e+00 -9.2451e-01 7.4625e+00 -6.7360e+00 -8.9020e+00 -#> 1.2189e+01 -1.8027e+00 1.6689e+01 3.4474e+00 2.9138e+00 1.7912e+00 -#> 1.3682e+01 -1.6062e+00 9.7473e+00 4.5117e+00 5.6799e+00 -5.4646e+00 -#> 6.8369e+00 1.3800e+01 -6.7010e+00 -2.2776e+00 6.9862e+00 7.4298e+00 -#> 5.5939e+00 7.5994e+00 -9.1281e+00 -1.0513e+01 3.1338e+00 2.2780e+00 -#> 4.0438e+00 9.6674e+00 -8.9905e+00 -6.4321e+00 -2.2459e+00 -2.7442e+00 -#> 1.7242e+00 
3.3175e+00 -1.9010e+01 -5.4166e+00 2.1368e+00 1.3751e+00 -#> 9.9083e-01 -4.8365e+00 4.8378e-01 1.6902e+00 2.2087e+00 -3.4764e-01 -#> 5.4890e-02 -7.2534e+00 -2.5185e+00 1.2493e+01 -2.4334e+00 -6.3627e+00 -#> 9.5054e+00 -4.4612e+00 1.8223e+01 -3.7394e-01 3.6688e+00 7.5635e+00 -#> 2.2095e+00 4.6699e+00 9.9971e+00 -6.1401e+00 -1.0550e+01 -1.5797e+00 -#> -9.8198e+00 9.8286e+00 5.4587e+00 1.0334e+00 -3.0479e+00 1.7580e+00 -#> -1.2700e+00 2.6403e+01 -4.4089e+00 -8.6787e+00 -8.4800e+00 4.4394e+00 -#> 2.8741e+00 -1.9863e+00 -3.2514e+00 -8.6687e+00 9.8180e-01 -1.7195e+00 -#> -5.3990e+00 2.8465e+00 1.2402e+01 -8.9504e+00 2.4825e+00 -1.1775e+00 -#> -4.4550e+00 -8.1371e+00 6.7530e+00 3.7530e+00 -1.0986e+01 -2.4960e+00 -#> -2.8496e+00 2.0082e+00 -4.6388e-01 -9.5946e-01 1.0177e+01 -3.2097e+00 -#> -9.3333e-01 1.1898e+00 -4.7260e+00 -5.8988e+00 1.5007e+00 3.4544e+00 -#> 3.4987e+00 3.1301e+00 1.5726e-01 -1.7080e+00 4.4588e+00 4.7642e+00 -#> -2.8878e+00 -6.6251e-01 -3.1073e+00 -3.9279e+00 -6.6434e+00 -1.0017e+00 -#> 7.0556e+00 -1.6597e+01 2.5636e+00 -1.2606e+01 7.0497e+00 -1.9386e+00 -#> -1.7508e+00 -1.2447e+01 -1.0717e+01 5.4835e+00 5.0638e+00 -3.2764e+00 -#> 4.4112e+00 -5.0044e+00 -1.2282e+01 -1.1170e+01 -6.7855e+00 -6.1712e-01 -#> 4.2148e+00 1.2460e+01 -8.7559e+00 -2.6166e+00 -1.9169e-02 1.4165e-02 -#> -#> Columns 13 to 18 6.2091e+00 -8.8143e+00 7.2104e-01 7.2597e+00 -1.5449e+00 -3.9369e+00 -#> -9.3598e+00 5.4552e+00 -2.3879e+00 -1.3020e-01 -1.2810e+00 -6.2295e+00 -#> 1.6856e+00 5.1225e+00 -6.9954e+00 1.0107e+00 1.1691e+00 -1.7933e+00 -#> -3.2094e+00 4.2271e+00 -1.9460e-01 6.9871e-01 -1.8737e+00 -6.9162e+00 -#> -5.5509e-01 1.1511e+01 9.7265e-01 -4.1372e+00 6.0545e+00 4.0498e+00 -#> -6.5004e+00 3.4623e+00 3.2891e+00 -1.6722e+01 -3.2627e+00 1.3829e+01 -#> -2.0242e+00 1.3205e+01 -8.8209e+00 9.6122e-01 -4.1757e+00 -2.7641e+00 -#> 1.6212e+00 -1.6687e+00 -4.6750e+00 2.5037e+00 2.8769e+00 -8.8224e-01 -#> 6.9457e-02 -9.3542e+00 1.1674e+00 6.2896e-01 -1.2544e+01 -5.5968e+00 -#> -6.5634e+00 9.7293e+00 6.9997e-01 -4.0996e+00 7.8806e+00 1.4738e-01 -#> 8.3325e+00 -7.4368e+00 -1.0828e+00 4.0310e+00 1.0186e+01 1.6531e+01 -#> -9.2470e+00 1.6434e+00 -1.0851e+01 -2.7858e+00 -4.7217e+00 1.2983e+00 -#> 3.9226e+00 -3.9145e+00 2.3867e+00 1.5888e+01 6.1320e+00 5.3633e+00 -#> 9.3941e-01 -3.2628e+00 -1.4717e+00 -9.6640e-01 9.1844e-01 5.7049e+00 -#> 1.4595e+00 1.0113e+00 6.7419e+00 -4.7996e+00 4.4700e+00 1.5168e+01 -#> -4.1207e-01 7.1577e+00 7.2850e+00 3.0012e+00 6.6942e+00 3.8092e+00 -#> -5.2720e+00 1.0192e+01 2.8500e-01 -8.5123e+00 1.5029e+00 -1.8070e+00 -#> -8.1098e+00 1.7250e+00 -1.1236e+01 -7.7844e+00 -5.8833e+00 3.2155e+00 -#> 1.0499e+01 1.0083e+01 6.7967e+00 4.1068e+00 1.0419e+01 -5.3592e+00 -#> 2.2096e+00 2.1852e+00 -1.7720e+00 6.6781e+00 9.3576e+00 2.2509e+00 -#> 5.5971e+00 3.2116e+00 -1.7596e+01 6.4026e+00 -3.3209e+00 -8.8648e+00 -#> -1.5210e+00 -1.3915e+01 -2.6886e+00 1.4786e+01 -4.6967e+00 -7.7192e+00 -#> -7.6820e-01 8.3450e+00 -1.9640e+00 -6.6878e+00 -2.4184e+00 -7.9350e-01 -#> 7.4614e+00 5.3085e-01 5.5515e+00 6.6437e+00 5.0550e+00 -1.3595e+01 -#> -7.7221e+00 -4.6417e+00 3.9403e-01 -1.3851e+01 -1.0311e+01 -1.3360e+00 -#> 6.6917e+00 1.2243e+01 1.4173e+01 2.8141e+00 1.1076e+01 5.8287e+00 -#> 1.0787e+00 2.0636e+00 -1.0247e+00 3.1193e+00 -9.4799e+00 8.2093e+00 -#> -1.5041e+00 7.0780e+00 -7.8316e+00 -1.5371e+01 7.1263e+00 5.9010e+00 -#> -8.2715e+00 9.9196e-01 2.1497e+00 -2.1559e+00 -1.2710e+00 -1.9854e+00 -#> 1.1778e+01 1.7334e+00 3.4908e+00 -2.5709e+00 1.1166e+01 -7.3452e+00 -#> -5.5082e-01 -4.1049e+00 2.4903e+00 
8.0472e+00 -3.3471e+00 3.5109e+00 -#> -5.2418e+00 6.6022e-01 3.3127e+00 -5.4575e+00 1.7177e+01 6.8944e+00 -#> 1.4118e+00 3.3244e+00 4.7817e+00 1.2690e+01 1.1262e+01 5.2541e+00 -#> -#> Columns 19 to 24 5.7987e+00 9.9046e-01 1.3476e+01 2.2373e+00 4.8450e+00 -3.3785e-01 -#> -6.4077e+00 -5.5996e+00 7.7242e+00 -4.9139e+00 -1.0795e+01 -2.7210e+00 -#> -7.8080e+00 -9.8749e+00 -8.0328e-01 -5.6685e+00 -6.5051e+00 -7.3282e+00 -#> 4.4262e+00 -4.0880e+00 -3.1434e+00 -1.5922e+00 -3.5972e+00 -4.8698e+00 -#> 8.9376e+00 -1.6125e+00 -3.0707e+00 2.6647e+00 -8.6386e+00 1.0014e+01 -#> 1.0788e+01 -8.3395e+00 9.1440e+00 -2.9792e-01 -5.8211e+00 5.8759e+00 -#> -6.4088e+00 -1.2216e+01 -6.1322e+00 -6.9714e+00 -1.8267e+00 3.8377e-01 -#> 6.6123e+00 -1.4050e+01 4.3107e+00 5.4709e+00 2.8144e+00 -4.5840e+00 -#> 1.3259e+00 -9.2734e+00 -6.1174e+00 -1.4116e+00 8.0506e+00 -1.7683e+00 -#> -1.3442e+01 -1.9701e+00 7.1781e+00 4.8789e+00 -1.0749e+01 4.3147e+00 -#> -1.0874e+01 -2.7071e+00 5.0782e+00 3.2553e+00 -3.3053e+00 5.0959e+00 -#> 4.7742e+00 -9.7450e-01 4.0312e+00 4.8259e+00 2.0546e+00 1.6256e+01 -#> -6.2039e+00 -7.7300e+00 2.7282e+00 9.7221e-01 1.7280e+00 -1.1206e+01 -#> -3.4114e+00 -5.4780e+00 1.9617e+00 1.9247e+00 -1.0711e+00 -6.1604e-01 -#> -4.0607e+00 6.2097e+00 -6.0131e-01 1.2949e+01 -9.1161e-01 -5.7228e+00 -#> -4.4931e+00 -6.3089e+00 -1.1048e+00 -3.4241e+00 -5.9472e+00 -1.1120e+01 -#> 1.4240e+00 -6.8421e+00 5.2643e+00 1.2383e+00 3.9429e+00 -1.4067e+00 -#> -5.4825e-01 -1.4219e+01 -7.1825e-01 -3.0608e+00 -4.8196e+00 1.2558e+01 -#> -9.2407e+00 -1.0295e+01 -4.5680e+00 -6.3542e-01 -5.6528e+00 -1.1106e+01 -#> 2.5570e+00 1.3100e+01 3.2021e+00 3.3889e+00 7.0477e+00 6.1366e+00 -#> -3.4840e+00 6.5269e-01 -1.1508e+01 3.1331e-01 -2.6091e+00 8.6547e+00 -#> -3.7476e+00 -1.5001e+00 4.7892e+00 -7.2544e+00 6.8576e+00 -2.3800e+00 -#> 1.2855e+01 4.0347e+00 -4.9090e+00 1.6522e+00 -3.1490e+00 7.2887e-01 -#> -1.1071e+00 -1.1343e+00 7.0469e+00 -4.3051e+00 -6.4920e+00 -2.1248e+00 -#> -6.6440e+00 -2.3582e+00 -5.1569e+00 1.7230e-03 4.1819e+00 2.6481e+00 -#> 8.3807e+00 -1.0780e+01 -4.4174e+00 6.4726e+00 -1.4760e+00 -1.2222e+01 -#> 1.2455e+01 1.3503e+01 6.7132e+00 3.2466e+00 9.0439e+00 -1.0962e+00 -#> -7.0192e+00 1.2580e+01 7.1967e+00 -9.6692e-01 -2.7230e+00 7.7308e+00 -#> -7.6719e+00 7.7843e+00 5.2237e+00 4.2453e+00 6.7228e-01 -9.5117e-01 -#> 3.6038e+00 -6.0937e+00 -1.4692e+00 -8.3313e+00 1.5697e+00 4.7399e+00 -#> -1.2687e+00 1.0278e+01 8.8936e+00 2.8074e+00 2.1382e+00 3.5530e+00 -#> -1.9756e-01 9.7066e+00 3.1387e+00 7.1569e+00 2.4309e+00 -8.4276e+00 -#> 7.7145e+00 -3.3162e+00 9.2129e+00 1.5138e+00 -6.6110e+00 -5.4324e+00 -#> -#> Columns 25 to 30 3.3448e+00 9.3022e+00 9.5999e+00 6.9354e+00 -5.0616e+00 -6.9058e+00 -#> 5.9573e+00 3.5726e-01 -9.9542e+00 -3.3779e+00 6.1937e-01 6.9114e+00 -#> 2.3182e+00 3.1634e+00 -5.6326e+00 -9.8146e+00 1.1031e+00 4.6291e+00 -#> 3.7577e+00 2.9056e+00 -1.2893e+01 -7.0990e+00 -5.8789e+00 5.4299e+00 -#> -2.0569e+00 -1.2289e+01 -1.2072e+01 -8.9648e+00 6.1291e+00 -4.8387e+00 -#> 2.4939e+00 3.9368e+00 -3.3553e+00 -3.7930e+00 2.2403e+00 1.5174e+00 -#> 4.7722e+00 -2.3783e+00 -3.4526e+00 -7.4857e-01 4.7284e+00 -5.9305e-01 -#> -4.4239e+00 2.2117e+00 1.6915e+00 1.6951e+00 6.1919e-01 7.1786e+00 -#> -8.2375e+00 -1.1487e+01 -3.2686e+00 1.9129e+00 3.4600e+00 6.1822e+00 -#> 9.6635e+00 -9.8398e+00 -1.4344e+01 1.1570e+00 3.5877e+00 -3.4732e+00 -#> 6.8537e+00 -1.6598e+00 1.4938e+00 -5.8247e+00 1.0250e+01 -5.8323e+00 -#> 7.7056e+00 4.1740e+00 -1.0906e+01 -3.1704e+00 2.1178e+00 7.9223e+00 -#> -5.4461e+00 -4.5399e-01 -2.1419e+00 
3.8774e+00 -1.9193e+00 5.1422e+00 -#> -3.3196e+00 -2.4690e-01 2.8738e+00 -2.3391e+00 8.6590e+00 -3.8078e+00 -#> -1.1259e+01 -8.6266e+00 -2.9897e+00 -6.1473e+00 4.7037e+00 4.2867e-03 -#> 1.7899e-01 2.0520e+00 3.4490e+00 -6.9371e+00 2.1701e+00 -6.4096e+00 -#> 1.5743e+01 1.5727e+00 -8.0126e+00 -1.6293e+00 -6.9487e-01 9.7779e-01 -#> 3.3199e+00 4.9184e+00 9.6237e-01 -1.3034e+00 -6.9815e-01 1.5008e+00 -#> 1.7309e+00 1.7355e+00 3.3040e+00 -1.1595e+01 6.4257e+00 -2.0536e+00 -#> 9.0737e-01 -3.9525e+00 4.7663e-01 6.2257e+00 -3.6348e+00 7.5429e-01 -#> -1.6843e+01 -2.9076e+00 5.7143e+00 5.1020e+00 5.1055e+00 6.6208e+00 -#> -1.3249e+01 -3.3027e+00 1.0028e+00 1.5995e+01 2.1968e+00 4.1618e-01 -#> 1.8869e-01 4.9186e+00 6.6267e+00 -1.1144e+01 1.2518e+01 -9.4801e+00 -#> 7.2734e+00 -4.7865e+00 7.6977e+00 -2.5835e+00 6.9857e+00 -1.1586e+01 -#> 1.9629e+00 -7.9237e+00 8.7230e-01 3.6833e+00 1.0750e+01 7.8505e-01 -#> 2.2178e+00 2.6605e+00 -3.7676e+00 -8.0763e+00 2.7502e+00 -1.2980e+00 -#> -3.3262e+00 -2.9508e+00 -6.5470e+00 1.2192e+00 -5.2483e+00 1.0321e+00 -#> 1.7810e+01 -1.1750e-01 4.5557e-01 -8.7464e+00 1.0301e+01 1.7402e+00 -#> -1.9575e+00 -7.2531e+00 -5.6767e+00 4.7833e+00 1.0424e+00 -8.0447e-01 -#> 9.4049e+00 4.8694e+00 -3.4308e+00 -4.0935e+00 7.7406e-01 -8.3420e+00 -#> -1.0877e+00 5.9013e-01 -8.3452e+00 -6.0349e+00 -2.7546e+00 -1.5708e+00 -#> -4.6523e+00 -1.1413e+01 -7.7647e+00 -1.9390e-01 2.4253e+00 9.0860e+00 -#> -5.4974e+00 2.3994e+00 -4.8689e+00 2.3564e+00 1.8606e+00 -4.4984e+00 -#> -#> Columns 31 to 36 -5.3996e+00 -4.0268e+00 1.0155e+01 -9.1129e+00 -1.8529e+00 -5.9859e+00 -#> -1.1870e+00 9.0568e+00 -5.8302e+00 2.2678e+00 1.4472e+01 -7.0498e-01 -#> -3.1906e+00 -3.1934e+00 -5.5015e+00 1.1824e+00 1.1202e+01 5.9822e-01 -#> -4.0763e+00 5.6502e+00 -1.1138e+01 -5.5295e+00 -6.0794e+00 -9.0080e+00 -#> 1.4124e+01 -8.0207e+00 -1.2152e+00 -2.9232e+00 -1.2306e+01 -6.1385e-01 -#> -1.5200e-01 1.1691e+00 5.9120e+00 9.2071e+00 -1.2299e+01 1.2687e+01 -#> -4.5410e+00 -7.0710e+00 -6.6179e+00 -9.8775e-01 9.3494e-02 9.7834e-01 -#> -8.4071e+00 -4.0261e-01 3.7053e+00 3.0223e+00 1.0222e+01 5.3209e+00 -#> 3.2476e+00 -7.2387e-01 -8.1169e-02 5.4877e+00 -4.6037e+00 -3.2524e+00 -#> 7.5073e+00 7.5331e+00 3.5078e-01 -1.7766e+01 6.9871e-01 2.0369e-01 -#> 2.2511e-01 -2.8367e+00 2.4152e+00 -2.4106e+00 -9.4229e+00 5.5880e+00 -#> 4.5797e+00 -3.7804e+00 4.6730e+00 1.4458e+01 -1.3785e-01 5.0291e+00 -#> -4.1167e+00 7.9227e+00 -4.6733e+00 -7.1373e+00 1.6804e+01 4.3820e+00 -#> 8.5466e-01 -1.0724e+00 -9.3670e+00 1.3986e+00 -2.3463e+00 6.8835e+00 -#> 1.4297e+00 -4.2350e+00 9.0299e-01 -2.1366e+00 -5.3894e+00 -2.7163e+00 -#> -5.1330e+00 -1.9521e+00 -1.0124e+01 -9.9508e+00 -5.3707e-01 -1.4966e+00 -#> 4.0181e+00 1.2901e+01 1.0945e+01 -1.3669e+01 -8.6388e+00 -4.9692e+00 -#> 2.7508e+00 4.1519e+00 -2.2471e-01 6.2458e+00 -1.2785e+01 -8.8307e+00 -#> -7.6881e+00 -5.7679e+00 -7.7874e+00 -5.1019e+00 1.2219e+01 1.0732e+01 -#> 1.0853e-01 -5.1944e+00 -5.5256e+00 -4.7699e+00 6.8243e+00 6.7839e+00 -#> 6.1342e+00 -7.3736e+00 -4.8876e+00 6.9786e+00 1.0532e+01 1.9532e+01 -#> -7.2484e+00 4.8691e+00 -6.1163e+00 -2.4217e+00 9.1861e-01 -7.0110e+00 -#> 9.6960e+00 -1.0111e+01 4.1335e+00 8.7717e+00 -4.5262e+00 -1.9480e-01 -#> 1.0519e+01 -2.2790e+00 -5.3841e+00 -1.5754e+01 -4.0010e-01 5.5570e+00 -#> -1.7870e+00 -4.3120e+00 -8.3646e+00 5.1878e+00 -3.8105e+00 5.4791e+00 -#> -6.1213e+00 2.6515e+00 3.8603e+00 -7.3960e+00 -2.4629e+00 -3.2340e-01 -#> 2.4277e+00 -6.0967e+00 -6.1727e-01 9.6043e+00 8.1445e-02 -1.4710e-01 -#> -1.2415e+00 1.0929e+01 -8.9700e+00 -8.4613e-01 
1.9625e+01 3.9038e+00 -#> -2.6456e+00 4.6997e+00 -1.1893e+01 -3.1487e-01 1.3500e+01 -5.4387e+00 -#> 3.8680e+00 1.5947e-01 1.4209e+01 -1.6989e+01 -9.0062e+00 3.9879e+00 -#> 9.3004e-02 -1.7697e+00 -4.5250e+00 7.9326e+00 -4.7444e+00 -9.6158e+00 -#> 4.5809e+00 -5.6403e+00 -5.0796e-02 -5.9721e+00 7.1729e+00 1.8613e+00 -#> -5.4809e+00 1.7845e+00 -8.3083e+00 -5.4888e+00 -2.0179e+00 4.4420e+00 -#> -#> Columns 37 to 42 -4.7855e+00 -8.5175e+00 -2.1857e-01 8.4637e-01 4.3660e-01 -4.9466e+00 -#> 6.6450e-01 4.6346e+00 2.6782e+00 1.2447e+00 3.8391e+00 -8.0498e-01 -#> -1.1130e+01 4.2234e+00 7.1485e-01 -1.3026e+00 -3.9174e+00 4.1765e-01 -#> 8.3580e-01 1.3614e+01 3.9566e+00 -1.0713e+01 3.2567e+00 -8.4020e+00 -#> 7.6823e+00 3.8936e+00 3.1150e+00 1.7484e-01 4.1188e+00 1.7145e+01 -#> 2.6063e+00 -2.1592e+00 3.8965e+00 2.8466e+00 -1.9440e+00 -3.9970e+00 -#> -4.6261e+00 1.2012e+01 1.5563e+00 -3.5953e-01 -1.2297e+01 2.0953e+00 -#> -5.7136e+00 -1.6435e+00 -3.1510e+00 3.9118e+00 -6.4690e+00 -2.4583e+00 -#> 1.0818e+01 -1.7416e+00 -9.2704e-03 -7.6917e+00 7.7324e+00 -2.0817e+00 -#> 7.8892e-01 1.1853e+01 1.3597e+00 1.0491e-01 -9.5310e-01 5.4994e+00 -#> 2.7781e+00 8.3081e-01 1.2500e+00 -2.1802e+00 -1.1053e+01 1.8365e+01 -#> -5.6311e+00 -9.8486e+00 2.1665e+00 -2.4590e+00 -2.2154e+00 6.3714e+00 -#> -4.4185e+00 9.0754e+00 -4.9723e+00 4.0641e+00 -2.4477e-01 1.1559e+00 -#> 3.8127e+00 4.3389e+00 1.8134e-01 -1.1438e+01 1.4193e+00 9.3104e+00 -#> -2.5857e+00 1.2633e+00 5.8720e+00 -2.4309e+00 -1.2344e-01 7.6392e+00 -#> 1.4477e+00 1.0082e+01 2.5753e+00 -3.5404e+00 1.1207e+01 -2.6013e+00 -#> 2.7662e+00 1.3309e+01 -1.3517e+00 8.3357e+00 -5.1607e+00 1.3805e+00 -#> -5.8706e+00 -1.7318e+00 2.8528e+00 1.9490e+00 2.3670e+00 -3.3641e+00 -#> 2.8802e+00 1.3370e+01 1.8935e+00 1.2284e+01 -1.6036e+01 5.7118e+00 -#> 4.4686e+00 -3.7408e+00 -6.7829e+00 -8.1280e-01 -6.6447e-01 9.5033e+00 -#> 7.0788e+00 -7.5171e+00 1.4984e+00 -8.3525e+00 1.0497e+01 -4.3993e+00 -#> -2.2460e+00 -9.2195e-01 -4.3723e+00 -1.5712e+01 1.6383e+01 -6.0502e+00 -#> 1.1910e+01 7.7961e+00 9.7424e-01 1.0935e+01 -9.6052e+00 1.5065e+01 -#> -2.9370e+00 -1.0523e+01 -5.9286e+00 5.4506e+00 -9.7247e+00 -1.1499e+01 -#> 1.0424e+01 -1.3085e+01 6.8869e+00 -6.7083e-01 6.7912e+00 -3.1413e+00 -#> 9.4696e+00 1.3121e+01 8.3503e+00 5.9973e+00 1.9038e-01 8.1521e+00 -#> 1.2455e+01 -3.8683e+00 -1.1434e+01 -6.7846e+00 5.1093e-01 1.0225e+01 -#> -6.8801e+00 1.1190e+01 5.6877e+00 -1.1347e+01 -8.6186e+00 -8.9230e-01 -#> 6.7188e+00 -1.9456e+00 2.7723e+00 -2.1238e-01 5.3955e+00 1.3889e+01 -#> 2.5153e+00 1.3421e+01 1.0657e+01 1.0813e+01 -4.0389e+00 4.3639e+00 -#> 2.3169e+00 -6.5107e+00 1.1071e+00 5.0577e+00 -1.9199e+00 2.6549e+00 -#> 3.7111e-01 4.1470e+00 -1.4876e+00 6.1349e+00 -3.8993e+00 1.3496e+01 -#> 3.9903e+00 -2.5372e+00 2.3113e+00 -1.3299e+01 1.5658e+01 -2.1184e+00 -#> -#> Columns 43 to 48 -5.3974e+00 1.1470e+01 8.0168e+00 -3.1737e+00 1.8088e+01 6.7016e+00 -#> 8.4993e+00 7.5901e+00 -3.8522e+00 -1.5282e+01 2.7711e+00 -3.9009e+00 -#> 5.4798e+00 8.2084e+00 -4.3640e+00 -5.6298e+00 2.8553e+00 6.0013e-01 -#> 1.5117e+01 -3.8010e+00 -6.0661e-01 -8.6124e+00 7.9563e+00 -8.5322e+00 -#> 1.0926e+00 -5.7196e+00 -7.8125e+00 2.9962e+00 -2.4641e+00 -3.9728e+00 -#> -6.9750e+00 3.9074e+00 -9.7547e+00 1.8155e+00 -9.4942e-01 1.4033e+00 -#> 7.8986e+00 -2.0966e+00 -4.5563e+00 -9.9288e-01 -3.9173e+00 9.6752e-01 -#> -2.1302e+00 -3.0563e+00 5.9163e+00 -7.8072e+00 -5.2031e+00 4.8729e+00 -#> -3.2703e-01 -5.2523e+00 -1.7050e+00 6.8670e-01 6.2048e+00 2.3407e-01 -#> 3.8850e-02 9.0948e+00 -3.6616e+00 -6.1142e+00 7.1880e+00 
-1.5549e-01 -#> -6.8444e+00 4.9850e+00 2.1235e+00 4.4228e+00 -3.7574e+00 -8.3262e+00 -#> -3.4195e+00 7.0761e-01 -6.9093e+00 2.0876e+00 -1.3823e+00 1.0538e+01 -#> 7.4552e+00 -2.3935e+00 1.1926e+00 -3.8090e+00 -8.4963e+00 6.3465e-01 -#> -2.4364e+00 -1.1607e+00 -6.0250e+00 5.9558e+00 -3.0977e+00 1.4288e+00 -#> 1.1530e+01 -5.9603e+00 -4.1689e+00 1.2215e+01 -5.4808e+00 1.5768e+00 -#> 3.7910e+00 -1.1363e+00 2.6201e+00 -3.1267e+00 -7.8219e+00 -8.4765e-01 -#> 2.4809e+00 -2.6889e+00 9.0631e+00 1.3237e+00 6.6355e+00 4.8937e+00 -#> 2.8647e+00 1.6486e-01 -3.0638e+00 3.1515e+00 1.1612e+01 2.6685e+00 -#> 2.0890e+00 8.1913e+00 -5.3012e+00 -7.3151e+00 -5.1472e+00 -5.9437e+00 -#> 3.8042e-01 9.8934e-01 -1.1415e-01 4.0065e+00 -2.8091e-02 4.7139e+00 -#> -1.7802e+00 4.7301e+00 1.3032e-02 -3.6905e+00 -6.6128e+00 5.0625e+00 -#> 3.5504e+00 -8.2332e+00 3.7096e-01 9.8524e-02 3.4324e+00 2.8376e+00 -#> -9.2314e+00 -6.9470e+00 7.7655e-01 -5.2664e-01 -8.1657e+00 3.3885e+00 -#> -1.4360e+01 1.4785e+01 -3.7140e+00 -6.3802e+00 6.6014e+00 -1.1859e+01 -#> -7.1300e+00 1.4353e+01 -9.8361e+00 9.5068e+00 4.5462e+00 -4.0918e+00 -#> 2.4090e+00 2.6027e+00 8.3217e-01 -2.4805e+00 -4.2917e+00 -5.1048e+00 -#> -3.8890e+00 -9.2622e+00 -2.4179e+00 9.1678e+00 -1.0741e+00 1.4275e+00 -#> 1.0725e+01 3.6622e+00 -1.0390e+01 3.2040e+00 -8.1849e+00 -1.4154e-04 -#> 2.5473e-01 1.4962e+01 -6.1097e+00 7.3571e+00 3.6925e+00 -1.6461e+00 -#> -4.1509e+00 6.1366e+00 3.2962e+00 -5.4747e-01 3.5650e+00 2.8045e+00 -#> -4.0917e+00 -2.1213e+00 -6.7569e+00 2.4749e+00 4.6886e+00 1.8253e+00 -#> 4.5376e+00 -1.0110e+01 1.1122e+01 -1.1604e+00 -8.3908e+00 -3.9016e+00 -#> -4.6760e-01 1.4584e+00 5.2119e+00 -3.8599e+00 -8.4865e+00 2.6378e+00 -#> [ CPUFloatType{20,33,48} ]
# }
diff --git a/docs/reference/torch_conv2d.html b/docs/reference/torch_conv2d.html
deleted file mode 100644
index cc10c7461735979837cc92ef271d6dbb86933d81..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conv2d.html
+++ /dev/null
@@ -1,301 +0,0 @@
-Conv2d — torch_conv2d • torch

    Conv2d

Arguments

input: input tensor of shape \((\mbox{minibatch} , \mbox{in\_channels} , iH , iW)\)

weight: filters of shape \((\mbox{out\_channels} , \frac{\mbox{in\_channels}}{\mbox{groups}} , kH , kW)\)

bias: optional bias tensor of shape \((\mbox{out\_channels})\). Default: None

stride: the stride of the convolving kernel. Can be a single number or a tuple (sH, sW). Default: 1

padding: implicit paddings on both sides of the input. Can be a single number or a tuple (padH, padW). Default: 0

dilation: the spacing between kernel elements. Can be a single number or a tuple (dH, dW). Default: 1

groups: split input into groups; \(\mbox{in\_channels}\) should be divisible by the number of groups. Default: 1
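To make the groups argument concrete, here is a minimal sketch (assuming the torch package is attached; shapes are illustrative): with groups = 2, each filter sees only in_channels / groups input channels, so the weight tensor's second dimension shrinks accordingly.

library(torch)
# 4 input channels split into 2 groups of 2 channels each; every one of the
# 8 filters convolves over only 2 channels, so weight is (8, 4/2, 3, 3).
inputs = torch_randn(c(1, 4, 5, 5))
filters = torch_randn(c(8, 2, 3, 3))
out = nnf_conv2d(inputs, filters, padding = 1, groups = 2)
out$shape  # 1 8 5 5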

    conv2d(input, weight, bias=None, stride=1, padding=0, dilation=1, groups=1) -> Tensor

Applies a 2D convolution over an input image composed of several input planes.

See nn_conv2d() for details and output shape.

    Examples

# \dontrun{
# With square kernels and equal stride
filters = torch_randn(c(8,4,3,3))
inputs = torch_randn(c(1,4,5,5))
nnf_conv2d(inputs, filters, padding=1)
#> torch_tensor
#> ... (8 output channels of 5x5 values omitted) ...
#> [ CPUFloatType{1,8,5,5} ]
# }
diff --git a/docs/reference/torch_conv3d.html b/docs/reference/torch_conv3d.html
deleted file mode 100644
index 1eb895a10ec05a8c52f0183cf5523f6e0c413b45..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conv3d.html
+++ /dev/null
@@ -1,245 +0,0 @@
-Conv3d — torch_conv3d • torch

    Conv3d

Arguments

input: input tensor of shape \((\mbox{minibatch} , \mbox{in\_channels} , iT , iH , iW)\)

weight: filters of shape \((\mbox{out\_channels} , \frac{\mbox{in\_channels}}{\mbox{groups}} , kT , kH , kW)\)

bias: optional bias tensor of shape \((\mbox{out\_channels})\). Default: None

stride: the stride of the convolving kernel. Can be a single number or a tuple (sT, sH, sW). Default: 1

padding: implicit paddings on both sides of the input. Can be a single number or a tuple (padT, padH, padW). Default: 0

dilation: the spacing between kernel elements. Can be a single number or a tuple (dT, dH, dW). Default: 1

groups: split input into groups; \(\mbox{in\_channels}\) should be divisible by the number of groups. Default: 1

    conv3d(input, weight, bias=None, stride=1, padding=0, dilation=1, groups=1) -> Tensor

Applies a 3D convolution over an input image composed of several input planes.

See nn_conv3d() for details and output shape.

    Examples

# \dontrun{
# filters = torch_randn(c(33, 16, 3, 3, 3))
# inputs = torch_randn(c(20, 16, 50, 10, 20))
# nnf_conv3d(inputs, filters)
# }
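The tensor sizes above are expensive to run, which is presumably why the example is commented out; a smaller sketch with the same structure (shapes chosen here only for illustration) finishes quickly:

library(torch)
filters = torch_randn(c(4, 2, 3, 3, 3))  # (out_channels, in_channels/groups, kT, kH, kW)
inputs = torch_randn(c(1, 2, 8, 8, 8))   # (minibatch, in_channels, iT, iH, iW)
nnf_conv3d(inputs, filters)$shape        # 1 4 6 6 6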
diff --git a/docs/reference/torch_conv_tbc.html b/docs/reference/torch_conv_tbc.html
deleted file mode 100644
index 8d56e06ee47d8f16d17a04eb54c741e4a25cd774..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conv_tbc.html
+++ /dev/null
@@ -1,223 +0,0 @@
-Conv_tbc — torch_conv_tbc • torch

    Conv_tbc

Arguments

input: input tensor of shape \((\mbox{sequence length} \times batch \times \mbox{in\_channels})\)

weight: filter of shape \((\mbox{kernel width} \times \mbox{in\_channels} \times \mbox{out\_channels})\)

bias: bias of shape \((\mbox{out\_channels})\)

pad: number of timesteps to pad. Default: 0

conv_tbc(input, weight, bias, pad=0) -> Tensor

Applies a 1-dimensional sequence convolution over an input sequence. Input and output dimensions are (Time, Batch, Channels), hence TBC.
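A minimal sketch of the TBC layout (assuming the torch package is attached; sizes are illustrative). Time is the first dimension, and with pad = 1 the output keeps the input's 10 timesteps, since 10 + 2*1 - 3 + 1 = 10:

library(torch)
inputs = torch_randn(c(10, 4, 8))   # (time, batch, in_channels)
weight = torch_randn(c(3, 8, 16))   # (kernel_width, in_channels, out_channels)
bias = torch_zeros(16)
torch_conv_tbc(inputs, weight, bias, pad = 1)$shape  # 10 4 16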

diff --git a/docs/reference/torch_conv_transpose1d.html b/docs/reference/torch_conv_transpose1d.html
deleted file mode 100644
index 954bcc9233425107a9cb272c1dfe540e965b7a9b..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conv_transpose1d.html
+++ /dev/null
@@ -1,5300 +0,0 @@
-Conv_transpose1d — torch_conv_transpose1d • torch

    Conv_transpose1d

Arguments

input: input tensor of shape \((\mbox{minibatch} , \mbox{in\_channels} , iW)\)

weight: filters of shape \((\mbox{in\_channels} , \frac{\mbox{out\_channels}}{\mbox{groups}} , kW)\)

bias: optional bias of shape \((\mbox{out\_channels})\). Default: None

stride: the stride of the convolving kernel. Can be a single number or a tuple (sW,). Default: 1

padding: dilation * (kernel_size - 1) - padding zero-padding will be added to both sides of each dimension in the input. Can be a single number or a tuple (padW,). Default: 0

output_padding: additional size added to one side of each dimension in the output shape. Can be a single number or a tuple (out_padW). Default: 0

groups: split input into groups; \(\mbox{in\_channels}\) should be divisible by the number of groups. Default: 1

dilation: the spacing between kernel elements. Can be a single number or a tuple (dW,). Default: 1
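These arguments determine the output width as oW = (iW - 1)*stride - 2*padding + dilation*(kW - 1) + output_padding + 1. A quick sketch (assuming the torch package is attached; shapes are illustrative):

library(torch)
x = torch_randn(c(1, 16, 50))
w = torch_randn(c(16, 33, 5))
nnf_conv_transpose1d(x, w)$shape                                  # 1 33 54
nnf_conv_transpose1d(x, w, stride = 2, output_padding = 1)$shape  # 1 33 104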

    conv_transpose1d(input, weight, bias=None, stride=1, padding=0, output_padding=0, groups=1, dilation=1) -> Tensor

Applies a 1D transposed convolution operator over an input signal composed of several input planes, sometimes also called "deconvolution".

See nn_conv_transpose1d() for details and output shape.

    Examples

# \dontrun{
inputs = torch_randn(c(20, 16, 50))
weights = torch_randn(c(16, 33, 5))
nnf_conv_transpose1d(inputs, weights)
#> torch_tensor
#> ... (output values omitted) ...
-1.1539 3.8906 7.1575 0.5660 2.4047 -#> -5.3541 -2.6976 4.3906 2.5245 0.3295 2.7928 -#> -10.4245 7.0469 -6.8186 -4.2369 2.9327 2.8947 -#> 3.0145 0.9534 5.5193 7.3609 0.7635 1.8795 -#> 1.0380 8.5256 -16.1735 -1.1572 -3.3035 -0.8527 -#> -5.8692 -11.7066 -13.6596 4.0496 10.6453 1.8254 -#> -5.8070 -8.3313 21.3003 0.7767 -2.3350 3.4440 -#> -12.4272 -14.2899 1.9738 2.6648 -7.6391 2.7511 -#> -16.9396 -9.4806 4.9570 0.1588 3.8355 3.2884 -#> 5.2529 -7.1969 -4.2218 -6.2748 2.3049 -1.3600 -#> -13.7700 15.0260 -11.5880 -3.3049 -0.6822 -3.0384 -#> -10.5514 -9.3660 5.5534 1.1381 -2.6856 -1.1547 -#> -2.5532 -0.4677 5.0003 5.1773 0.7936 -0.4262 -#> 6.8253 -5.2287 -6.6104 1.5442 -2.8864 -0.4845 -#> 11.6100 14.6819 -9.3433 -6.4061 -13.6912 -0.5850 -#> 0.3866 -5.1808 4.5291 7.7504 1.9457 -1.1015 -#> 0.5982 -8.9206 4.5865 0.8641 1.1949 1.2615 -#> -5.3683 -4.6591 0.5351 -8.2208 -2.7109 3.5416 -#> -13.6418 -10.2297 -9.1510 5.2640 2.1182 -4.0404 -#> 4.0355 -8.2871 -9.8668 -5.2217 -8.3299 0.6824 -#> -3.4248 -5.1316 9.4518 0.5314 0.5506 -2.2042 -#> 1.9089 -0.8906 -2.7357 3.9258 4.9725 -0.2202 -#> 7.4031 -6.5030 -0.5144 0.6044 1.5107 -1.8649 -#> -2.6418 0.4153 3.0424 8.7083 1.7534 -0.0252 -#> 4.0239 3.5966 -7.6770 0.8856 5.5774 -2.7345 -#> 2.3516 -11.4263 0.2037 -6.9024 2.5088 3.9246 -#> 11.4006 -1.0101 -2.1060 -2.4315 0.8993 1.6014 -#> -14.6229 -1.0623 11.6151 -9.1313 6.1282 1.5521 -#> -2.1762 -12.8544 -2.2436 5.0173 2.8554 -0.1953 -#> -#> (3,.,.) = -#> Columns 1 to 8 5.0065 -7.5648 4.3697 5.4119 1.1572 17.5422 -0.4933 3.3253 -#> 4.5796 7.3903 -1.7693 -2.0851 2.1409 3.3564 1.4481 -4.8963 -#> 1.6970 -2.0240 0.1673 -13.3189 10.1225 4.5772 12.1823 -12.9104 -#> 4.4423 0.3706 1.9800 -14.1760 1.7023 -7.2400 0.9277 -10.3556 -#> -4.9464 5.5538 3.4204 -2.9786 -1.0806 -6.1542 -0.2451 -9.0744 -#> 11.9943 -1.9920 5.2217 -4.9694 -3.6795 7.3552 11.9307 8.9454 -#> -6.8636 5.1646 3.3317 -2.5187 -3.2920 2.8625 8.4681 -1.5043 -#> -3.4725 -0.3777 0.8911 -0.7854 -1.9812 -1.6493 -27.8762 2.8427 -#> 4.7066 -3.2704 -0.3828 -6.5696 7.0008 -5.8864 13.5979 -16.6016 -#> 5.4407 3.1746 19.6670 0.1120 -2.7427 -11.9844 1.0637 5.8977 -#> -7.6526 -4.7646 7.5172 2.7917 -15.0073 1.5688 -6.8070 5.4306 -#> 2.9191 1.0430 0.1736 -4.6040 -3.2349 18.2567 -0.5415 -12.4009 -#> -2.2711 4.5532 4.4744 -0.5926 2.2176 14.9981 -11.5919 14.1614 -#> -5.8899 2.6850 0.1409 -2.7625 -0.4703 14.8921 -3.4326 -3.3055 -#> 0.4423 -8.7097 -10.6624 8.3148 6.7439 -2.4797 12.4699 -20.2065 -#> 8.0021 0.3554 0.3091 -3.3935 12.4163 -1.2953 4.3594 -4.7604 -#> -2.1517 -1.1949 6.7236 -9.2434 -11.6224 6.0957 10.3943 -1.4627 -#> 0.7060 -1.5135 -20.7254 -6.8302 -3.2998 15.6868 -5.5653 -9.2229 -#> -5.0176 -0.9663 -1.7610 15.8706 -1.0764 -10.6421 -5.5809 -3.3236 -#> 1.9067 -10.7770 -3.6526 10.3817 9.1903 6.5684 -17.1229 -7.5234 -#> 1.2750 -3.1628 -1.8613 8.5306 1.5362 0.4427 -4.1968 -0.8201 -#> -2.1752 -1.8308 6.5429 7.1769 10.1203 -15.6256 -3.5803 4.7920 -#> -3.9520 1.5366 -1.0068 -8.6163 -5.8413 5.2089 -2.8411 -5.1499 -#> 4.3731 3.5496 -1.5140 6.9102 -1.4003 8.9362 -6.4642 -4.5631 -#> 4.2081 -9.6222 -3.5010 0.5712 -2.3747 6.0067 -4.5879 8.2174 -#> 6.5613 5.2677 -1.1577 15.0782 11.2545 -8.9979 2.4309 3.5133 -#> 4.0774 -0.2183 -7.1818 -3.6122 4.8120 5.3588 7.6044 0.1746 -#> -8.7328 -10.5805 1.2447 3.5645 6.8815 -6.6055 1.7821 -3.7090 -#> -6.2692 -0.6318 -4.2597 -8.9461 -2.7332 -6.6875 -7.8548 4.0098 -#> -0.3510 3.9007 -14.3480 -0.3358 -9.3153 -3.4372 -6.5250 6.8921 -#> -1.2713 2.7064 5.8096 -5.4592 1.9853 5.8544 -6.8093 2.7735 -#> -2.0104 -1.4024 -2.1828 -2.0720 10.3598 8.6430 -0.3503 
3.7573 -#> -4.4944 2.3952 -2.2312 -6.9977 -4.7600 2.7363 -0.5609 6.7697 -#> -#> Columns 9 to 16 -6.9998 10.6761 8.6290 1.0619 4.6502 -8.7584 -8.5783 5.7564 -#> 1.2974 -3.5524 6.9653 -19.2955 -4.2965 3.1079 13.4470 -4.0555 -#> -5.2354 4.2810 -0.9473 18.6950 -8.9680 9.0909 -22.2445 -16.3682 -#> -1.2449 6.2986 -12.2026 19.6986 -8.0719 4.8052 -13.2800 -24.9230 -#> 3.1749 5.6027 -5.8446 4.7225 -1.5331 9.1026 6.1800 6.6361 -#> 10.8301 14.3328 -7.5849 17.7657 6.8040 2.6532 -6.7402 2.3008 -#> -2.4575 -5.3909 3.3260 12.3631 10.6478 1.0650 -6.7919 6.5352 -#> -6.5352 -1.1951 -2.9045 -4.2426 1.1777 -9.2049 1.3871 22.7300 -#> -6.4794 8.2347 -10.4154 4.7242 -8.1403 7.2454 -13.2459 0.8178 -#> -1.7903 -5.8577 0.6970 9.9278 5.4227 9.8878 -14.7169 -0.7110 -#> 6.4007 -7.5623 -10.7833 4.0560 5.7243 6.1790 6.8927 -6.2734 -#> 2.0545 -3.5389 2.0349 -10.0992 -3.0352 -4.9398 2.8822 0.6213 -#> 9.8403 9.5956 -10.4498 -1.1934 5.3889 -9.9235 -3.0167 -12.1756 -#> -1.9861 -9.2124 6.4746 -6.2048 12.4562 -2.5963 7.2102 -7.0540 -#> 6.5246 4.7411 -12.6090 2.8226 -13.1843 7.1695 -9.3513 -5.5817 -#> 12.1244 6.2580 -2.8042 -7.4370 5.9493 12.8463 5.8486 -8.9301 -#> 2.3902 -0.9217 6.3379 3.5433 7.7250 -5.8269 6.4756 0.2072 -#> 4.1890 0.4687 13.0917 -8.7310 -4.4070 -12.8233 -3.4196 5.6194 -#> 7.1521 -6.5397 0.6541 -6.3224 10.2388 4.7956 -12.6320 4.4017 -#> -1.5785 10.3988 -10.7020 4.6607 -7.1149 -2.0043 -1.2336 -6.8194 -#> -2.8968 2.4503 -5.4942 -14.4237 6.6551 11.6778 8.5156 11.1045 -#> -1.1568 -2.7098 1.0471 -3.7183 4.8246 -1.9663 -4.6640 -1.1049 -#> -8.5518 7.5451 -15.9437 -1.2995 0.7863 6.2623 1.3795 1.0162 -#> -1.5262 8.2184 1.0894 1.9374 -4.0241 -0.2278 -15.7278 -12.5403 -#> -3.6396 -0.5381 -9.3408 4.3941 -12.5852 11.0130 7.4217 -5.1584 -#> 3.5167 -0.9388 8.3914 -6.8105 -2.8391 -21.0392 0.0891 6.5242 -#> 6.7332 -3.6219 -10.7829 7.0740 7.6049 5.0294 -0.9974 -3.7430 -#> 8.4995 11.3926 -10.2824 -2.8517 -9.0784 11.1800 2.0026 -2.0633 -#> 4.7858 -14.6920 7.6764 14.7249 7.3598 -4.3916 -7.6974 -6.0448 -#> 4.9977 -2.4565 6.1780 -5.7729 -1.9491 7.0341 9.1896 2.2460 -#> -3.9617 0.7266 -8.6600 0.5977 9.7916 8.5833 11.3199 -4.5682 -#> 3.0461 7.4758 2.2648 3.1652 15.9415 -9.5217 -7.9929 -14.9009 -#> 0.0593 8.0995 5.7684 11.2536 -1.8930 0.8747 -0.8771 -17.4723 -#> -#> Columns 17 to 24 2.9991 -7.0031 -4.3098 3.9696 -3.8698 3.7981 -3.4638 -5.8145 -#> 9.8929 -0.6985 1.5678 -7.6750 -1.0575 3.5687 10.1861 -2.8345 -#> -3.5003 11.2459 -9.4071 7.3753 -0.4676 -7.4027 -6.5143 4.4750 -#> 2.7744 3.1267 -5.5507 -3.7926 4.7195 -4.2368 -16.2356 11.4313 -#> 14.7289 3.2852 -5.1273 -26.9714 -5.7715 -11.4138 -6.7957 3.3423 -#> -0.2818 5.4439 -0.5775 -0.4554 -2.5376 18.6859 12.4762 -13.8969 -#> 4.9449 1.3183 -0.1317 -5.1529 3.8452 9.2586 1.8964 13.9726 -#> -1.6536 0.3676 -13.3478 -4.2053 0.0647 -6.0252 9.3051 -6.1649 -#> -5.5615 14.3067 -1.6848 12.2309 14.4537 -8.1157 -0.8579 1.5930 -#> -0.9635 -1.4980 4.2655 -21.3527 -7.6679 -7.1719 -0.3917 1.2707 -#> 3.1651 2.3176 4.5623 -2.1620 2.1102 1.5631 6.1667 -12.4041 -#> -3.5800 15.3647 -13.4311 -7.2799 0.5813 -0.9263 17.0942 -18.8560 -#> -1.9807 1.5200 -10.6795 -0.0452 -4.0646 16.5639 -0.9831 -21.0682 -#> -0.0563 2.6941 -12.5726 -15.9221 -5.0472 11.8357 9.6567 2.6775 -#> -0.7082 15.5344 -10.7737 -14.3932 -12.7231 -17.7728 -9.7606 3.9705 -#> -11.7585 -0.7036 6.9729 2.0480 -1.0557 13.8488 -3.0825 1.0844 -#> -5.2574 2.7561 -4.0692 -7.9613 -7.0095 11.1284 14.2156 -2.0798 -#> 12.7254 -12.6184 -4.4636 10.5271 8.7149 -3.2069 5.7323 4.0469 -#> -14.2750 1.7038 10.0903 -2.5975 15.6578 -16.2574 15.1585 -11.0779 -#> 12.1564 
-2.2589 -12.5739 -10.4091 -6.1980 -5.2244 4.4709 -3.7656 -#> -18.4174 -15.6443 -1.5797 -5.4252 6.9559 2.8517 16.7162 -15.9417 -#> -9.5823 11.4253 6.4563 -7.3173 11.0476 4.3020 4.4361 -3.1137 -#> 2.2054 7.7500 4.4457 -11.8929 -18.6036 10.0097 11.4196 5.3231 -#> -1.3877 -14.1901 10.9835 5.7080 2.7517 -2.3153 2.5598 3.1702 -#> 11.1839 -7.5466 6.8821 12.2485 3.6899 10.8344 -6.7247 2.7138 -#> 6.8281 -12.7237 -12.4240 0.8836 -10.6440 2.3363 0.7084 4.0937 -#> -3.7046 3.6566 3.7686 17.9101 2.9497 3.7957 -10.0212 7.2813 -#> -15.1208 3.2913 10.3041 11.0956 2.9664 5.2545 3.8653 5.1548 -#> 0.9173 -13.3204 5.8748 4.8021 4.3568 -2.2284 -1.8059 4.8992 -#> -1.8480 -7.9347 0.0099 11.5403 -5.5167 -14.6196 -13.5348 -3.0772 -#> 6.2677 3.2872 -3.0857 -13.0179 -17.1925 -3.1941 4.9096 -10.0129 -#> -4.5396 7.1775 -5.0795 -2.1652 -4.6862 9.7565 4.7690 -8.4607 -#> -6.4992 -17.9611 -0.7486 8.4730 0.2359 14.2265 -4.9010 13.5942 -#> -#> Columns 25 to 32 -0.4502 -2.3367 5.3966 -1.9736 14.0377 3.0119 -8.4109 -13.9355 -#> 3.9575 -4.0850 -0.7513 -8.3361 -0.8708 9.1271 0.5462 -14.4554 -#> -6.7282 1.3936 7.7278 15.0878 3.4429 -3.3617 -7.3842 -6.6403 -#> -6.4748 -13.2080 3.8655 10.9370 -5.1134 -14.1147 8.0535 -11.6929 -#> -12.5487 -1.3291 3.7207 -1.6684 -11.1489 -1.3516 9.4046 -2.0490 -#> 5.3655 9.5782 -17.2964 -9.7356 0.9319 11.7845 -0.2503 -2.2241 -#> -4.9538 -22.1325 8.2270 17.2945 -4.1733 -6.3520 -0.8176 -5.2402 -#> -4.1229 10.7951 3.8800 -14.9516 -13.4749 -13.0130 -2.2828 -2.7846 -#> 4.6756 -9.9426 -6.9022 4.8072 15.7269 12.2702 -3.2373 -5.9807 -#> 4.7910 -8.1889 3.8486 -5.5455 -18.6664 8.1700 7.6792 -5.5944 -#> 5.4961 -5.3140 -14.3234 2.9961 0.9553 2.8977 -3.2754 10.3027 -#> -0.4717 7.9712 -12.2355 -3.7001 11.2613 4.8899 -19.9430 3.0561 -#> 10.5788 11.4136 -4.8488 -8.5414 1.8690 1.0350 -0.8902 -6.1415 -#> 10.3371 -10.8624 -12.7971 1.8257 -3.5686 -1.8141 -6.7775 -5.4592 -#> 4.4009 -3.1121 4.6466 18.3852 11.2915 -3.8269 -7.2640 2.5717 -#> 13.3067 -0.2645 -2.1777 -3.1643 4.2687 12.0935 10.4656 19.6666 -#> -8.0351 -4.7948 -8.3152 -14.3852 -4.9098 0.2674 -12.8918 -5.0915 -#> -6.9013 -3.8669 6.1630 -0.5587 1.0668 -7.0706 6.9699 8.8962 -#> 4.4448 -4.7124 -12.8896 14.1769 3.3869 -5.7435 -8.1179 -5.2362 -#> 4.2559 16.4625 2.6017 -3.6140 -1.8479 -22.0223 -7.7596 -0.5762 -#> -7.4018 16.2872 -12.9994 -7.2869 -7.2918 13.4027 18.9048 17.6126 -#> 1.9429 6.2656 1.1654 -6.0094 7.2305 -3.4192 -6.5580 -11.7218 -#> -1.4432 -3.2813 -9.4318 -10.5674 -1.7319 17.8822 14.0319 -6.5661 -#> -4.2170 9.2337 2.8840 1.7105 -0.0033 4.7875 7.6331 0.7086 -#> 1.5496 -4.3581 -11.1832 -1.6342 15.4827 0.6272 0.9326 19.2356 -#> -2.5731 -0.6965 15.7973 2.5878 -13.0473 -8.7761 -7.2038 10.3990 -#> 0.1765 -2.1546 -2.9154 17.0378 1.3992 1.8228 -11.6180 11.1169 -#> -19.8042 5.9646 -15.4726 -11.0236 -6.0788 4.6658 -4.0302 -6.4444 -#> -16.1552 12.7106 8.4312 1.0798 -3.3709 -10.7590 18.7330 1.0471 -#> 2.8258 -19.5791 2.6700 24.3105 0.5782 -0.9183 -3.7057 -0.9869 -#> 2.3721 4.8013 -6.0742 -5.4514 -2.3220 0.5604 13.2491 5.0343 -#> 8.4863 3.8528 6.6870 0.3344 12.9178 28.9992 1.1368 -3.5024 -#> 7.6834 -2.3466 -9.0723 -5.1307 -6.5264 14.4130 9.0037 -5.7521 -#> -#> Columns 33 to 40 6.1166 3.4755 3.0107 10.4820 2.2313 21.4302 -0.3769 7.2492 -#> 7.5885 2.9510 -5.9707 -3.1287 -6.9030 0.4736 -9.9255 -11.9223 -#> -7.3547 -2.6745 16.3513 -9.9079 -7.4035 15.0191 6.9734 -0.3176 -#> -3.9057 7.4514 -3.5686 -0.0520 5.5190 -8.5440 16.0547 -6.0800 -#> -3.9551 9.4788 2.8746 -14.6228 -8.6366 -17.5136 -2.6790 -0.4776 -#> -9.5146 -5.2213 11.0260 -14.8032 6.0196 1.7087 6.6660 14.5329 -#> 
-5.6591 6.8087 -15.9567 -2.8290 -4.5220 -7.0795 4.1778 -3.3880 -#> 2.3287 9.4812 1.4361 2.4778 -4.0854 -14.8061 -15.7904 -13.1929 -#> 12.3652 -12.8175 4.8446 9.9093 2.9067 24.6350 18.0183 2.1561 -#> -12.6300 -1.2267 -5.1545 12.1703 -2.8878 -1.3358 8.3011 -2.3132 -#> -0.8691 -1.6677 17.0963 2.4648 -7.9536 -11.2632 -4.9028 -4.6315 -#> -2.7268 0.6535 15.4282 8.6797 3.8988 5.2600 4.8597 1.6352 -#> -8.4937 -0.0901 5.7381 5.0044 11.3519 -8.4415 13.9622 -1.7646 -#> -16.1659 4.4432 -11.6796 -0.2280 7.2947 0.4916 23.7616 4.4628 -#> -7.2801 7.3802 3.8974 -3.3115 3.7231 2.3252 9.6789 3.7708 -#> -3.8670 -11.5865 1.2037 -4.6438 7.7230 -3.4565 -6.9513 5.7064 -#> -12.7542 -12.4351 2.0368 -1.3465 7.0032 5.8571 1.5565 0.1240 -#> 2.4653 -3.8384 -6.6116 -0.0328 -2.6110 9.6295 -1.0983 -7.9041 -#> 18.4993 -2.1162 -7.7738 0.2191 8.9795 11.0376 7.3325 9.2761 -#> 3.1431 -11.9472 0.6696 7.9047 1.0182 4.9934 -7.1324 -9.2449 -#> 12.1020 1.9057 -1.7627 9.8111 6.0275 -2.3225 -1.9654 -5.9898 -#> -5.4913 -1.6481 21.1171 15.0078 -6.5409 -2.4634 -9.6712 -3.9974 -#> -27.1950 -3.4378 -0.7142 -6.0477 9.2275 -22.3768 -14.0585 7.8989 -#> 0.9481 11.1605 5.6543 -4.2198 5.2720 9.0667 -8.2835 0.0860 -#> -1.8083 -11.5910 3.3375 -21.1419 0.6927 7.6406 -25.4208 12.1868 -#> 2.0968 -8.3505 5.4435 8.9627 10.5742 14.8554 12.7098 -5.0643 -#> -0.4590 -4.4083 -0.0300 -4.9880 8.3520 -3.7739 19.8174 -0.7580 -#> 10.4876 2.0805 1.9403 2.2905 18.1642 -4.9934 -3.4718 -1.4246 -#> -16.6475 11.8381 -9.6935 -20.3861 8.1342 -1.1187 8.6158 10.3533 -#> 5.5304 -1.2605 -15.8330 -13.0794 11.5950 -10.1415 -4.5288 -9.6068 -#> 1.0786 -3.2658 16.3202 0.3046 -4.1398 -7.0007 -3.2427 -0.0090 -#> 2.3637 -7.6107 4.8096 -14.9503 16.2826 -1.3325 3.9388 11.2878 -#> -5.3557 16.9209 0.8013 -1.4738 3.4657 -9.9022 -1.0349 -4.0073 -#> -#> Columns 41 to 48 1.3345 -11.9278 4.8337 -20.2754 3.6607 -7.6054 -10.6370 0.7393 -#> 18.4758 4.1840 5.1066 0.5228 4.7843 25.5798 1.4430 -4.8073 -#> 2.2567 -12.8952 0.7386 -6.6147 -16.7057 21.4841 -9.8661 -0.3491 -#> 0.1938 8.7895 -10.5219 -5.8892 -1.2568 25.8267 -9.5847 -17.0484 -#> 0.5941 13.5316 12.5077 11.4497 -15.2622 7.9676 3.8970 -5.2587 -#> 3.5475 -0.9832 4.7096 10.4700 7.5481 -6.0933 -1.1715 0.8234 -#> -15.7558 10.3998 -3.9478 -2.2627 -3.1319 -13.5418 22.4130 5.0423 -#> -15.0093 0.8777 8.9129 -0.8943 6.5934 -8.7079 -9.1901 1.2953 -#> 0.7930 -18.1611 -9.1900 -1.9323 -4.9864 0.0681 -2.9314 1.2715 -#> 8.6339 5.4117 14.6414 2.9027 18.3501 9.9014 -8.9372 -0.7209 -#> 4.2320 3.9845 7.9440 3.8098 -7.0927 6.7669 -6.9893 17.8406 -#> -7.9320 -18.7648 -7.3420 2.7295 5.3951 3.3670 -8.2192 4.4860 -#> -2.9330 20.8977 5.6355 -2.1919 -12.0907 -3.9535 6.5782 8.8423 -#> -9.3347 -10.6555 -17.5191 -6.0478 -6.2106 1.9033 17.4623 -13.6784 -#> 1.0635 -20.0535 -9.3549 -19.6059 0.9665 5.6258 2.6148 -15.1398 -#> 7.1171 13.7309 10.8487 -5.9154 2.0694 -2.6388 2.8409 7.6188 -#> -9.5625 -5.8672 -2.7635 9.1451 -1.4415 -10.1499 8.3045 -3.1570 -#> -0.8245 -1.6012 8.4346 7.0643 0.2006 -6.0378 -5.0282 7.1434 -#> -20.1260 -14.5046 -4.1219 -16.9426 2.5031 7.4219 -3.9130 -0.1316 -#> -9.7300 -6.6407 -14.1674 -9.5041 4.6654 13.3797 -17.6050 5.9562 -#> -5.4223 -14.4687 7.3019 5.0101 -12.4661 -10.5039 -1.3772 9.8727 -#> -1.9127 -18.7884 8.1959 1.4009 -0.5677 -3.3554 7.6146 -4.5400 -#> -8.3890 -4.7103 -10.4115 -10.5468 -11.2971 6.0386 -0.3137 -4.2605 -#> -0.4814 11.7382 8.6822 6.9266 7.6994 -3.8786 13.3594 5.3836 -#> 6.7302 -2.0560 2.8992 -1.6217 -7.0485 -3.6673 -8.4619 20.2970 -#> 1.4750 1.6396 1.5047 14.3273 12.3591 -3.2715 6.9656 -9.5265 -#> 11.8076 1.5431 -13.8370 
-2.2326 -7.8101 -4.1678 6.3293 -11.9680 -#> -4.2533 -2.1013 -11.6261 -1.8696 -1.4513 -3.3382 0.7499 -3.3817 -#> 1.1298 16.2342 -3.1134 0.6689 2.9858 -5.9440 6.0355 -6.6737 -#> -2.8073 11.9571 12.6337 8.9343 4.8625 2.5022 5.9198 4.3786 -#> -1.4268 0.1202 -1.7580 5.3506 5.1749 1.9549 -8.0605 -6.0836 -#> -2.4020 8.9237 7.2497 -13.5399 -2.3874 12.1784 4.0796 16.0535 -#> 0.6426 23.0241 2.1876 -4.6716 -2.3914 4.1827 5.9588 12.7554 -#> -#> Columns 49 to 54 -14.4496 -8.8349 1.4433 2.1690 -3.7491 0.9587 -#> -7.7497 -6.5779 -8.7464 9.5712 -5.7366 -5.5686 -#> 4.6983 5.0069 -3.8794 11.1549 -4.5174 4.5328 -#> 1.5492 -0.2539 -1.7878 -1.7838 2.4953 -0.5657 -#> 3.1202 0.9545 3.0343 -1.6115 -8.4593 5.7329 -#> -4.5750 7.9842 3.8035 -13.5802 -5.4309 -4.1773 -#> -5.8305 6.6706 5.9399 -4.3707 6.1034 -0.6240 -#> 3.9163 -7.3941 -6.3986 -1.3349 -4.0819 -4.4972 -#> 4.2833 1.5292 -2.7047 0.2604 -0.3884 3.3314 -#> 7.3679 18.8472 -7.1299 -6.6655 1.7560 -7.6865 -#> -8.4181 -8.1825 2.7599 5.0031 -9.8373 -0.7030 -#> -1.7533 7.0408 1.4097 4.1171 -2.0185 0.9093 -#> -15.0229 2.6766 -0.4257 -7.9381 -2.5720 4.4902 -#> 0.2040 1.1599 13.5789 -7.6558 10.2325 -6.6165 -#> 4.6438 -3.2434 10.0951 3.6194 -4.4035 -0.0836 -#> -9.2463 2.0903 -0.0023 10.8511 -6.4154 -5.2644 -#> 2.7451 -13.9360 -9.0759 -7.8088 2.8181 -5.4235 -#> 3.7136 -6.3668 -1.2021 2.3239 7.6207 -3.6235 -#> -0.2203 3.0907 -11.3325 -1.1252 -11.7583 -3.8903 -#> 7.2152 0.2197 -1.1622 12.1529 -2.1452 3.1754 -#> -9.8983 2.5051 -0.5053 1.2530 1.9963 -3.8447 -#> -6.6937 8.3050 0.6103 1.0726 -4.1585 -3.8931 -#> -0.9202 5.0524 0.0446 8.4999 8.9261 -1.5784 -#> 4.6479 -1.3398 11.7661 -1.6349 0.1920 -0.1233 -#> 8.7547 -3.9087 3.6918 5.1973 3.0588 2.7503 -#> 16.5450 9.8928 0.2408 0.5691 6.6154 1.0948 -#> 0.1952 -11.5720 13.2281 -6.8009 3.6701 0.9840 -#> 2.0523 -17.0915 -1.5819 -4.7299 -3.5012 -0.7101 -#> 7.6151 -2.8723 3.9770 -5.7623 2.6486 -0.8233 -#> -3.1527 4.0325 8.9116 -5.7383 -0.0428 7.4691 -#> 7.4466 9.5679 -3.7968 6.6464 2.2457 -3.8793 -#> -13.5084 7.2839 -5.4541 -4.2447 0.3259 3.5467 -#> 7.9819 -10.1110 6.5147 -0.2204 2.2264 -2.8546 -#> -#> (4,.,.) 
= -#> Columns 1 to 8 -2.1050 12.3998 -0.4774 1.2383 5.8022 -6.9281 -5.4728 6.0727 -#> -1.8157 0.5752 1.5176 9.4034 10.4397 -3.4623 -8.0909 8.9566 -#> 2.8523 -1.4397 -2.0096 0.4399 -2.9058 13.6328 1.0217 13.0719 -#> -1.5294 2.2925 -6.5809 -2.0880 -19.8867 14.5897 -13.2773 -11.1911 -#> 6.7109 -7.0110 0.6445 -0.2109 1.6469 9.4859 -5.7427 -10.0133 -#> -4.8364 5.8289 4.6019 3.3430 -1.8501 -10.7135 21.6716 -15.6815 -#> 7.1862 0.7821 12.1798 8.3415 -3.7003 6.7931 -1.0732 -6.0702 -#> 6.0033 -0.7861 -5.6532 8.0808 9.2785 12.5299 14.3445 3.3772 -#> -2.2405 4.2441 2.1795 2.5741 -12.6495 2.4456 6.2146 -5.1256 -#> 3.2793 13.7299 3.0425 -3.4000 3.7587 10.1255 7.9639 -18.8988 -#> 0.8084 -8.2614 1.0017 10.9275 10.0781 5.7191 -3.9948 0.1607 -#> -1.2355 -1.4259 -3.7475 1.8783 0.8166 0.9565 -4.1186 5.8185 -#> 1.4094 -12.0985 -11.9980 7.8775 9.0769 15.9720 -4.7439 2.9385 -#> 1.5571 0.0629 -0.1035 -1.5539 -3.8787 -14.4867 -3.6049 -22.4549 -#> 4.0708 -4.1490 0.1908 -5.8443 -4.5887 -0.4553 -0.2272 -8.7493 -#> -4.0910 10.3983 0.2122 2.4811 -10.1094 2.2425 -10.9766 5.7353 -#> -1.1070 -3.0927 -0.9344 0.5191 3.5854 0.6241 -0.4108 -3.0465 -#> -6.5693 4.7755 -9.9835 3.8776 -1.0840 8.8799 -12.0977 14.3494 -#> 1.4439 5.9479 14.3636 -2.7812 -2.9505 -1.6697 -2.3293 -10.8652 -#> 5.3586 0.2086 -5.2899 -5.4280 9.1076 -3.8572 5.6281 6.1843 -#> 2.3577 3.8243 -3.7754 0.7548 -6.3366 -7.6611 0.5793 1.8813 -#> -1.0511 7.6017 -20.3287 7.5251 -7.2557 5.9654 8.1025 14.8469 -#> 7.8334 0.2638 -5.3560 5.1233 -0.2849 1.2744 -9.1953 -2.2904 -#> -3.4615 2.4915 -1.4092 -8.7137 1.1823 4.8117 8.3513 16.4244 -#> -3.6477 8.8516 -8.0689 0.4341 8.3788 2.0504 -8.3104 11.5212 -#> -5.6302 0.9044 3.8023 -4.9904 18.6028 -11.3102 7.7670 3.6899 -#> -0.8651 1.9924 8.2783 -7.4464 4.8625 -15.3295 0.5469 -9.0181 -#> 0.7463 -6.4614 1.3538 3.3951 2.1724 12.7520 7.3761 -11.6228 -#> 2.6962 -12.6195 -3.5614 6.3831 -0.7390 4.9695 7.9886 9.1374 -#> 0.5365 6.8245 -1.8546 -7.4207 -12.1718 12.5915 -8.2897 5.3908 -#> 0.0312 4.4716 -0.3820 3.3666 -15.6750 -0.7308 -5.7290 -10.0158 -#> 0.0913 -9.0728 -10.2081 0.3705 0.5208 -4.9135 -13.3187 -3.3307 -#> -2.4412 -8.5937 2.6922 0.0637 7.2697 10.2367 -0.7692 7.1194 -#> -#> Columns 9 to 16 1.4035 1.4040 4.8057 -16.6953 -10.7570 15.9873 -3.9211 2.1780 -#> -1.6966 -5.1725 19.3307 -6.3238 6.2951 4.6960 -2.6081 20.1427 -#> -1.5471 -7.7275 3.1516 -0.9020 -10.0211 -7.9036 1.2798 0.7613 -#> -0.2585 -4.4660 -6.3808 -6.9989 7.8038 0.3097 3.0570 -5.2005 -#> 1.4197 1.7325 -17.5488 -3.1597 13.1648 2.9228 -3.4837 -14.5672 -#> 15.8725 23.6275 -5.4269 -23.7333 -21.2530 0.0843 1.8968 14.8612 -#> 13.6775 10.2984 -13.2271 -13.6772 5.7547 2.0044 -3.7326 -19.6737 -#> -16.4962 6.2314 5.0742 -0.6590 0.0642 0.3928 -11.1059 -7.1018 -#> 11.2382 0.8385 -1.0045 -4.1813 -10.6938 4.4621 14.5863 -18.2117 -#> 1.6609 -5.2850 5.2872 0.7011 -0.5309 -15.8560 4.4899 -23.4390 -#> -14.5825 4.0461 11.6341 -17.4078 -7.0555 -5.2670 0.8999 -5.6901 -#> 2.8568 6.2582 10.6664 -14.0320 4.2543 4.6557 -4.4138 2.9445 -#> -5.7549 11.1731 -8.0999 -14.7524 7.7781 -2.1008 -18.4997 2.9257 -#> 2.8536 8.2467 13.3471 0.4846 5.7097 1.9348 -6.6551 -8.1418 -#> -3.3299 -4.0816 13.1217 16.2571 -1.7483 7.3860 14.2170 -11.9363 -#> 11.0684 5.0762 5.3195 0.1432 12.0400 -3.7210 2.7910 -1.7928 -#> 19.4745 -2.3785 8.6529 -4.5426 -2.4921 7.6448 0.1893 -0.8252 -#> -1.9241 -9.2395 5.6588 -7.6330 -0.7623 -0.6414 9.5515 8.5972 -#> 9.1962 -6.1070 6.8629 -12.9698 -1.6276 24.5720 16.0840 -17.1156 -#> -19.7998 6.6669 12.4325 8.5895 4.8028 -7.0879 -12.0306 1.7905 -#> -9.7381 -10.4174 12.7223 17.6191 
16.4640 1.4797 -2.2388 4.2665 -#> -3.5749 1.9684 4.4351 0.8422 3.2918 -15.7919 -17.1336 -9.5118 -#> -5.4865 18.5721 13.7783 1.7104 9.1968 8.7916 5.1292 -8.2372 -#> -10.1325 19.9818 2.2621 -2.6759 1.3810 10.1955 1.6041 0.8605 -#> -8.0232 8.5082 -8.4552 -5.3418 2.3370 13.6287 2.0805 8.9783 -#> 11.1102 -8.4891 2.2244 2.1921 -7.0178 -18.9312 -15.8038 1.8319 -#> 6.3255 -1.7980 -12.3073 0.5514 2.2092 2.9887 -5.8563 2.6762 -#> -15.6557 -3.5616 2.8305 -8.2303 -19.6977 11.7540 1.1495 1.8021 -#> -2.1927 17.4035 -0.7762 0.3718 -5.3043 -6.2854 -6.9760 -15.9494 -#> 15.0398 -13.4662 -24.0948 1.7141 14.0322 16.5744 0.9768 -0.3527 -#> -14.3824 5.7630 15.4500 -0.7237 2.4711 6.4872 2.1575 -3.1848 -#> 10.7607 -2.2199 4.4435 4.3157 6.8573 2.1049 -20.8255 23.3302 -#> -2.3852 14.4152 1.5751 -5.2666 -9.6740 -7.5906 0.5588 12.2082 -#> -#> Columns 17 to 24 -0.3194 -14.9899 -11.1432 7.6206 -2.6515 20.3641 6.9513 1.0261 -#> 8.1171 -7.2204 -20.3917 -8.9583 -0.4736 9.6779 12.4881 -6.0153 -#> 7.8626 -5.3077 -1.0013 -4.2854 -3.4871 12.5138 -4.1042 -4.1187 -#> -0.2872 -14.5971 -15.9643 6.7193 11.3281 16.3837 -6.2956 -8.5059 -#> 7.8821 -2.9166 11.2147 -13.4587 -1.7634 8.1924 -8.6742 10.9291 -#> -5.3366 3.4671 7.0700 10.0996 -3.0505 5.1970 -6.2224 -0.0521 -#> 3.1443 13.5610 8.5994 -11.0686 4.1030 1.0456 -10.7196 -4.6311 -#> -5.1903 1.7303 -10.9142 -3.1094 1.8742 -16.7151 -0.4450 14.1182 -#> -5.8055 -7.5304 3.0536 -5.1155 -7.8851 18.8317 -2.9984 -7.9829 -#> -9.4636 0.1752 1.8338 -3.9810 -18.1357 7.3099 3.1274 -19.7982 -#> -1.5617 16.0187 -1.5374 -20.1662 -5.5443 7.1038 -4.8856 3.5684 -#> 4.4075 -20.2555 15.7781 2.6051 16.9903 -1.7933 -4.4984 -6.9582 -#> -0.3479 13.9096 -15.6244 -11.1793 -3.5875 -15.4238 -7.4633 8.6940 -#> -12.2850 -4.8526 8.2597 -7.7774 10.2423 10.6756 15.3577 3.9057 -#> -9.5595 -12.0798 5.7455 13.8555 2.1398 11.5047 2.4477 -7.0594 -#> 6.9720 -3.8493 11.8396 11.0641 -2.8332 -0.1463 -7.4761 -6.3588 -#> 5.5957 1.3788 12.8145 -16.4845 -2.3937 -4.1697 2.6650 9.6846 -#> 16.3475 -1.5722 -3.2263 -1.4248 2.0256 -1.4739 4.0604 -10.2989 -#> -0.6586 3.4046 4.6824 0.4668 -9.5215 -0.6462 -1.9205 0.3976 -#> -16.8921 9.5151 0.8390 14.2056 -3.7218 9.2721 -5.7017 -8.5205 -#> -14.3077 9.3742 3.4875 9.6195 -6.0802 -3.7222 10.8635 5.9279 -#> -2.1217 -0.1907 4.4814 6.3114 4.5167 -3.5535 2.8435 9.9919 -#> -13.9939 15.1288 4.2053 -9.5740 1.5628 3.6481 -1.3641 -3.7322 -#> 8.7557 8.7261 25.1453 0.8178 4.6255 -6.7057 -8.6511 -0.1091 -#> 7.4447 -1.8325 3.1293 7.0337 -2.7863 1.6286 -16.0444 -3.8087 -#> -2.8873 2.8116 0.0587 3.4188 -9.5266 -14.8540 4.2804 -8.2651 -#> -9.8895 -6.4253 -10.2173 5.8122 14.4629 -1.7608 -11.0493 3.2803 -#> -1.0160 -13.9493 -9.9600 -15.3795 -3.2849 5.6923 -7.4746 12.0561 -#> 2.1805 4.5089 -2.3862 -4.1503 10.5217 -16.1267 -6.3030 6.9384 -#> -1.3073 7.8117 7.1677 2.6476 -8.0955 8.7233 2.8246 4.7810 -#> -18.4368 -2.3778 0.1608 13.3684 6.7485 6.4731 6.1167 -1.2516 -#> -0.6043 24.9120 -1.4423 -6.7582 8.9642 -8.6571 -14.0482 16.8610 -#> 17.7286 20.2208 18.4647 -7.7855 1.3575 -9.3874 -0.5325 -11.0693 -#> -#> Columns 25 to 32 5.6326 5.8350 -13.5859 2.6041 -4.8413 -0.6250 -6.9518 -5.0680 -#> -12.4608 -1.0832 -3.9246 -6.0014 -14.6060 9.3799 -5.2067 9.0054 -#> -9.7539 1.2320 -1.4374 -8.5207 -10.2753 -2.1604 -1.0834 -9.2973 -#> -2.7700 -3.0238 -6.5836 22.4792 -6.0836 -3.0262 9.0822 -13.0279 -#> -3.6857 -8.2444 -9.6928 -5.6210 -3.9370 3.0351 7.6838 -11.3569 -#> -0.2301 6.7172 -1.5944 -1.4519 0.7078 -22.6055 -0.0719 -9.2618 -#> 22.1931 -19.8122 -7.2434 -1.8270 -7.0690 -20.1464 -11.8704 2.4956 -#> 8.5558 -13.4251 9.5341 
-4.0036 -0.1540 8.6128 7.1111 7.8143 -#> 7.0072 4.0444 -14.5680 -15.6934 16.3534 -8.4037 2.4889 -12.2590 -#> 4.0391 13.6764 -10.0684 6.1171 -5.3206 -15.5514 5.7784 -2.1322 -#> -0.0031 -7.7351 -9.6109 6.7777 10.3609 -5.9732 -5.8301 0.0472 -#> -7.5225 -8.9016 -3.5149 -12.1527 -13.5420 -10.9205 1.6544 -16.7605 -#> 0.7409 -9.4100 4.2403 6.1743 13.0698 4.1074 -2.2450 11.6677 -#> 7.2575 5.4975 1.5036 8.9499 -6.6605 1.1732 10.0563 -2.5328 -#> -13.7980 2.2803 -11.4877 -8.7844 -11.2398 -18.5173 21.1523 -24.5849 -#> -13.6781 4.3191 -1.2661 3.1900 -9.3190 -2.6247 2.3197 -7.3712 -#> 14.6047 -0.4196 6.9357 15.3953 5.5046 -5.5453 0.5608 -0.1292 -#> 7.9832 2.5611 -7.4931 8.8954 -12.2209 9.0451 -13.8076 15.4440 -#> 16.5112 0.9398 2.9589 -0.9302 10.1620 -6.0823 -2.3119 11.8207 -#> -2.4773 11.4658 4.3734 -14.3540 -4.6328 6.7320 10.7169 -2.2529 -#> 5.1290 7.3429 -6.4964 -1.1527 9.2697 11.9739 -7.9363 7.1066 -#> -6.5489 -4.9425 -4.4167 11.7435 -4.6573 3.6331 -4.8614 -0.2404 -#> 4.7502 -13.9168 8.9244 1.8197 -3.4975 2.8928 5.0875 -16.2945 -#> 8.9716 -22.2495 -7.6811 -13.3519 -6.7208 -3.6837 -6.2659 -10.6813 -#> 10.0090 -2.3997 -3.9190 11.7128 -11.4862 2.7779 10.6134 -17.2960 -#> 5.2759 10.7843 6.3188 -9.5183 3.7632 -13.0112 7.6870 -9.9708 -#> -17.1446 0.4763 1.7853 8.8129 0.8247 -12.7194 13.7973 -10.3702 -#> -2.9073 7.9717 -1.6721 4.9187 17.0543 27.8806 -3.1481 3.5462 -#> 0.6223 -5.8077 21.9198 14.0404 1.6793 11.8098 -2.5856 5.5326 -#> -1.5546 8.4663 -19.7711 -3.5745 -9.0930 2.5057 -8.8058 1.5028 -#> 2.9323 2.4301 0.4083 1.3978 2.1401 -3.1432 -9.2657 10.6113 -#> -8.3944 2.1382 3.2571 5.1617 4.1159 11.8319 -2.4654 3.5631 -#> -9.5940 -3.8198 -0.0708 0.2324 -0.3559 5.0562 -9.3026 -0.0156 -#> -#> Columns 33 to 40 5.4751 -4.9685 -1.1140 12.0255 -2.7069 0.1429 -3.9142 -1.8313 -#> -21.6627 7.3637 -1.2541 -10.8029 -6.5101 4.8416 -2.5360 5.8329 -#> 0.8076 18.1017 -4.4381 -0.4216 -1.6496 0.0826 8.9062 -9.5129 -#> 11.0231 -3.6290 -7.2440 1.0281 5.9202 -2.9915 -0.1684 -6.0654 -#> -1.8620 4.0979 3.5261 -2.2876 -3.8825 -4.7818 30.0266 -5.1267 -#> 9.2627 -9.3801 -6.3897 4.1681 3.5414 -1.2313 2.2550 -7.4358 -#> 3.8479 0.4866 21.6011 11.6342 -7.4530 9.2773 4.7980 4.0954 -#> -12.4142 0.9372 -1.0704 -8.8271 -10.6478 7.2930 3.4420 -7.5093 -#> 0.1698 -2.8374 -9.1786 3.2934 9.4190 9.5889 -12.3461 -6.6323 -#> 6.0238 -16.7210 -2.1618 6.7680 9.0894 -11.5117 9.2601 4.0035 -#> 7.0340 -5.0686 -6.7381 6.7828 3.2271 -1.7524 6.9165 -1.2493 -#> -2.5774 -9.4541 5.8497 -10.5635 7.0962 4.1460 -3.2895 -4.5333 -#> -0.0374 -5.8546 0.1963 -1.3632 -10.0908 9.7387 -1.9190 -8.9648 -#> 2.5153 -6.6226 7.6842 -8.7523 -0.5876 -4.2538 0.6357 7.1436 -#> 15.0600 8.3653 -5.3016 -8.9727 1.9099 -4.7034 -14.8080 6.8321 -#> 5.2180 1.6998 9.6705 -1.0297 -0.2300 -18.0583 -5.9676 2.8278 -#> -6.6438 -3.8793 10.7047 1.1690 -10.4911 -4.8902 -8.9453 2.6225 -#> 2.6865 -7.0715 17.3035 -1.0292 8.6419 1.5127 11.3743 4.6567 -#> -0.8823 1.9922 -1.6805 2.6441 7.0532 -3.5863 1.5426 -9.0900 -#> 3.4192 22.3566 -13.5973 -0.9803 -0.6945 -13.5652 -6.6271 -8.6951 -#> -4.7106 -1.8860 4.7969 -3.3994 -0.0136 -1.0474 7.1799 -12.5076 -#> -3.5109 -6.8928 -10.2952 -27.2561 -4.6149 1.7142 -12.5523 -1.0221 -#> -6.7093 10.3142 8.2573 2.4934 -1.7516 -11.4688 -2.4966 -15.7521 -#> 7.1038 -8.7811 1.0369 -5.1111 10.7257 -0.7682 -2.9512 -2.0453 -#> -11.9724 15.1011 -3.2683 5.3593 10.3720 -2.2955 6.3947 4.5287 -#> -6.6439 6.2459 -5.5014 -5.6457 0.1835 12.0601 2.1755 11.0305 -#> 1.2823 -1.0151 0.4840 9.5057 6.6848 -7.5415 3.5770 1.3109 -#> 7.9927 -9.9826 1.0853 -13.3779 -7.0886 3.5114 -5.8115 -1.8867 
-#> -1.5721 -1.4694 5.1739 -5.5387 13.7887 3.5508 14.0987 8.0543 -#> 11.2043 6.1202 1.7812 17.8911 9.0120 2.1963 15.4762 12.7197 -#> 8.6884 -13.1554 -4.0781 6.2083 10.8189 -9.2288 -7.2779 3.9982 -#> -8.0184 16.5070 5.8675 -2.3072 -16.1820 18.5364 -5.8738 -16.4123 -#> 3.2305 -0.1945 8.0007 15.5950 6.0886 3.3087 -5.3623 0.7048 -#> -#> Columns 41 to 48 -8.6006 -4.8646 27.5523 -1.5430 -15.4093 7.7829 7.3981 -0.3793 -#> -15.8642 -0.2972 -3.1149 -0.9991 -3.9445 -12.1184 -10.9670 6.3001 -#> -0.7089 -11.8555 2.6656 0.5943 16.2644 -5.9812 -12.9371 -1.6014 -#> -12.7581 -0.4289 6.1141 -8.4463 -2.0868 -6.8926 -2.6126 -9.0016 -#> -8.2506 21.2704 -4.0491 -2.2546 12.1547 0.1667 -8.4576 10.7930 -#> 16.0594 4.1562 10.0196 -7.5173 -0.3256 16.0685 -0.1030 -16.9110 -#> -0.6021 12.6787 2.3527 -8.8386 -7.3754 13.4047 -8.9777 -5.7262 -#> -2.1249 -14.3050 5.0873 6.6320 -6.5322 -16.8134 -7.4323 -3.4885 -#> 12.5125 -10.3117 18.7218 4.0142 2.5446 8.3031 0.4404 3.8229 -#> -0.8576 1.2573 0.1647 -5.6551 25.1561 2.0695 -17.7567 -22.7935 -#> 17.9452 -11.9911 9.8603 7.5845 -2.0050 6.4060 -4.7783 0.6968 -#> 3.0729 -2.5976 17.7593 -0.2200 -14.6376 11.7562 12.4318 -22.8368 -#> -3.3752 -8.7275 22.4086 10.6214 -6.4601 17.0018 1.2447 -9.9543 -#> -17.1848 7.0295 -20.1981 15.9027 -13.7001 14.6122 -12.5801 8.7190 -#> 7.3749 -8.2294 -10.3922 5.2209 10.2326 -1.1034 -2.3768 5.0779 -#> 6.5220 6.8331 -5.5146 -26.3487 10.0968 7.4076 4.2510 -0.1241 -#> -16.3551 23.9117 -12.9491 -6.2878 -8.0376 -8.9809 -11.6969 -16.2643 -#> -3.3320 -7.2563 7.0096 -5.1591 -10.8903 -8.3051 11.5121 -4.8205 -#> 30.1463 -27.2953 -10.3255 1.2407 -3.8466 -11.4208 10.3090 -21.3064 -#> 16.6340 -28.7060 7.5365 -0.2053 -9.2341 -6.5734 -10.7766 -2.7444 -#> 11.6982 -2.6341 -11.3039 -7.5007 5.1457 7.6688 26.6237 -15.1442 -#> -9.7540 -8.5738 -14.3518 -5.7013 13.9404 -3.8985 -0.1138 8.8463 -#> -10.4308 11.0512 -5.5217 -22.5346 1.6473 -5.2179 -9.4511 -7.9082 -#> -2.4912 0.3161 -11.0292 6.5892 0.1307 -1.6834 -1.0987 1.6923 -#> -3.9673 -7.9401 22.3718 -2.3691 11.7177 -11.3505 6.1523 3.1161 -#> 2.6399 13.7468 -0.3763 18.8045 8.5708 4.0262 -8.7736 4.2290 -#> -2.8576 7.1759 -8.0101 10.8422 6.3865 5.0936 1.2168 9.2354 -#> -10.8633 -1.0824 -0.8987 11.4758 -11.9743 -10.4646 13.7240 -1.7358 -#> -14.1178 21.2416 3.7955 24.8035 6.0983 0.3335 3.4862 4.4295 -#> 11.0182 5.3442 -3.7926 -20.3325 23.6837 -3.2902 -8.2127 -9.5710 -#> 1.1583 2.6508 -16.1883 2.1579 -12.3988 16.1296 1.6853 2.6362 -#> -4.6801 -1.2721 3.6902 8.4217 20.9060 -13.4412 18.0424 -1.5549 -#> -8.8554 17.1304 -6.1151 8.3124 -10.3070 -2.1585 -10.7280 1.6565 -#> -#> Columns 49 to 54 -6.7632 9.0205 3.9832 1.1877 3.4006 5.9775 -#> -8.4520 -5.9237 2.4574 0.5103 -1.2747 -0.2187 -#> -6.3469 0.8673 -5.0039 -2.8374 -1.5800 0.5130 -#> -15.8223 -10.4915 -2.9986 -0.9827 -1.1368 -2.5114 -#> -10.8001 7.0442 9.2466 8.6179 2.7842 0.3649 -#> 20.2535 11.2160 3.6390 13.7110 13.4839 8.3034 -#> 4.1736 -9.9043 -5.7932 -7.1145 1.6513 -1.3660 -#> -2.9477 -6.7830 2.2032 4.6185 4.4635 2.1922 -#> -19.0045 9.1668 -3.2519 5.0947 -1.5906 1.6095 -#> -5.0198 -7.6808 -18.1146 5.1944 4.6247 2.0399 -#> 8.0772 10.3144 -3.3351 4.8508 -1.3147 1.5389 -#> 6.1753 -1.1323 7.2598 9.2451 -4.0144 4.3835 -#> 2.9787 0.5584 7.9027 -0.5738 -0.4446 3.5570 -#> -2.4050 -19.9975 -0.9384 -1.0172 1.1957 -3.5614 -#> 16.0283 11.9335 1.4864 13.4882 0.2626 2.1828 -#> 6.7011 1.7985 -3.5908 3.4675 0.3625 1.4105 -#> 14.4403 6.5076 8.2722 12.7859 4.8887 5.4573 -#> -4.6987 -9.9025 -0.4740 -6.8387 -0.7232 -0.5840 -#> -0.4707 3.2551 -7.0753 1.6114 -1.6163 -0.1045 -#> -3.9348 7.0254 
-7.9377 -5.2363 0.3976 -1.0753 -#> -5.7204 6.0980 1.9556 -4.8694 0.6084 -1.2481 -#> -8.0397 4.9181 1.6892 -0.0915 0.5730 0.9259 -#> 5.1109 -1.9765 3.3636 4.5646 6.2376 3.4933 -#> 6.6943 -21.3563 -4.1890 -4.2812 -1.6969 -2.0259 -#> 15.2933 -11.0076 -7.0053 -1.5960 -3.4357 -2.7896 -#> -1.0028 -14.8088 -0.0199 -5.7920 -1.0041 -0.3028 -#> 11.7314 -0.1637 -1.2221 -3.7700 -1.0029 -3.3862 -#> -2.0893 6.2374 18.0681 -0.3367 -1.2742 -0.1907 -#> -0.0101 -6.8336 -1.6090 -7.7200 -3.4319 -1.2397 -#> 5.7481 -7.0355 4.3746 0.9063 -8.7526 -4.9835 -#> 7.4789 -5.0154 6.5255 6.2043 1.7799 4.7899 -#> -6.0105 7.1342 -0.0061 -2.5677 -8.1663 1.6100 -#> -0.1814 -12.1102 -1.1473 -2.8114 7.7898 -4.2462 -#> -#> (5,.,.) = -#> Columns 1 to 6 -6.4262e+00 -1.0302e+01 -8.0426e+00 -3.3373e-01 1.1466e+01 -1.5697e+01 -#> -1.1519e+00 2.0001e+00 -4.2894e+00 -7.6826e-01 -1.1227e+01 2.5261e+00 -#> 1.6970e+00 -3.3100e+00 -4.4895e+00 1.3259e+00 1.8962e+01 8.1508e+00 -#> 2.2075e+00 2.1765e+00 1.4384e+00 1.5917e+01 -1.1309e+01 7.0364e+00 -#> 3.8093e+00 -3.8368e+00 -1.6455e+01 1.2298e+00 -1.0367e+01 -1.5784e+01 -#> -9.8932e-01 -5.0788e+00 -1.5136e+01 -9.8048e+00 -6.8520e+00 1.4605e+01 -#> -4.7910e+00 -5.1260e+00 1.6453e+00 -5.3460e+00 8.2288e+00 2.0704e+00 -#> 6.3422e+00 9.8608e+00 5.8286e+00 -5.4466e+00 5.3061e+00 -8.3891e+00 -#> -4.0661e+00 -2.3066e+00 2.0202e+00 -1.3354e+00 8.2298e+00 -7.6327e-01 -#> -1.5477e+00 7.6396e-01 4.5311e+00 3.5211e+00 -1.4649e+01 7.8986e+00 -#> 2.5916e+00 2.3080e+00 6.7754e+00 -1.1042e+01 -1.3366e+01 -3.8321e+00 -#> 2.5285e+00 4.0808e+00 8.8407e+00 -6.2941e+00 1.3328e-02 3.6616e+00 -#> 4.6916e+00 -5.0228e+00 7.1309e+00 -4.2475e+00 1.1867e+01 -8.2675e+00 -#> 8.0642e+00 7.6726e+00 5.1226e-01 4.8608e+00 2.9767e+00 2.1985e+00 -#> -4.2219e+00 6.4403e+00 1.7694e+00 4.1442e+00 7.6349e-01 -1.0233e+01 -#> -9.3862e+00 -3.3176e+00 -1.1481e+01 2.5465e+00 -3.9686e+00 1.6527e+00 -#> 8.8190e-01 -3.4220e+00 1.1836e+00 -2.9330e+00 1.0608e+01 1.6665e+01 -#> -1.4457e+01 -2.0386e+00 -7.8769e+00 9.1628e+00 -1.2645e+01 8.8112e+00 -#> 3.3646e+00 2.8787e-01 -2.8648e+00 9.7906e+00 -4.8234e-01 2.3953e+01 -#> 5.5200e+00 8.0760e+00 8.1474e+00 1.6808e+01 3.8783e+00 -2.1914e+00 -#> -2.4432e+00 -6.1003e+00 -3.4941e+00 9.1001e-01 -1.7641e+01 -9.1526e+00 -#> 7.2679e+00 2.8391e+01 -5.7809e-01 6.8612e+00 1.4567e+01 3.5080e+00 -#> 6.7239e+00 6.1337e+00 1.0413e+01 1.2382e+00 4.5004e+00 2.7951e+00 -#> -2.6557e+00 -6.5517e+00 2.6339e+00 -3.3769e+00 4.9165e+00 1.9107e+00 -#> 4.2102e+00 -3.3543e+00 -1.2687e+01 2.5395e+00 2.5638e+00 1.9196e+01 -#> -3.8726e+00 2.0134e-01 4.1404e+00 -6.0846e+00 3.6685e+00 -6.0531e+00 -#> 2.7909e-01 1.0917e+00 -7.4160e-01 -2.7103e-01 8.1912e-05 -3.0494e+00 -#> 7.3136e+00 6.4701e+00 1.0309e+01 7.2169e+00 2.8521e+00 -2.0348e+00 -#> -2.8647e+00 -3.3199e+00 -1.1787e+01 -2.0081e+00 -4.9858e+00 -7.1792e+00 -#> -9.1410e+00 -1.2388e+01 3.4625e+00 1.7626e+01 -1.7789e+01 2.6632e+00 -#> 5.0764e+00 6.8630e+00 8.2448e+00 -1.5367e+01 -6.3745e+00 -8.8637e+00 -#> 3.8776e+00 -6.5466e+00 -1.8389e+01 -6.6696e+00 5.8418e+00 -6.7904e+00 -#> -3.2302e+00 -8.0956e+00 -5.7334e+00 1.4126e+00 -2.7921e+00 -3.9389e+00 -#> -#> Columns 7 to 12 -4.0465e+00 8.8923e-01 -1.4303e+01 -9.4046e+00 6.4015e+00 1.5997e+01 -#> -2.9243e+00 1.7137e+00 4.3121e+00 -7.3565e+00 -1.0214e+01 -1.2201e+01 -#> 7.1019e+00 5.0987e+00 1.9010e+00 -2.4699e+00 -3.0563e+00 2.9805e+00 -#> -8.1475e-01 8.9229e-01 1.4172e+00 -3.4651e+00 5.2722e+00 1.2982e+00 -#> -5.4192e+00 -9.6536e+00 6.7028e+00 -2.9846e+00 3.8598e+00 -8.2039e+00 -#> 1.7103e+01 4.1835e+00 -1.9952e+00 
5.5272e+00 -1.5506e+00 -9.4011e+00 -#> 2.2505e+00 -1.9357e+01 -2.2002e+00 2.4192e+00 1.0321e+01 1.7182e+01 -#> -1.0851e+01 9.3675e+00 -1.4618e+00 -1.9284e+00 -3.1685e+00 9.2756e+00 -#> 1.1996e+01 -8.0681e+00 3.1487e+00 4.6738e+00 7.7214e-01 9.2540e+00 -#> -2.0899e+01 -2.4664e+00 -1.0292e+01 -2.3162e+00 4.3699e-01 3.0183e+00 -#> -7.5234e+00 4.0622e+00 2.8611e+00 -5.0612e-02 -2.7615e+00 1.0727e+01 -#> -6.7269e+00 1.7070e+01 1.3479e+00 2.8591e+00 1.3784e+00 1.5057e+00 -#> -2.5754e+00 1.4918e+00 -1.6116e+00 9.1770e+00 -2.8859e+00 6.1093e+00 -#> 4.5459e+00 -6.6762e+00 1.0926e+01 -8.5546e+00 5.9326e-01 -4.5555e+00 -#> 2.1496e+00 -6.6026e+00 2.0679e+00 -2.2635e-01 1.2354e+01 8.0611e+00 -#> -1.0398e+00 2.4987e+00 -6.6752e+00 1.6018e-02 -5.0908e+00 -7.8365e-01 -#> 2.1959e+00 6.0708e+00 4.3137e+00 2.1468e+00 -6.7604e+00 -2.5864e+00 -#> -7.0599e+00 5.5481e+00 -4.1306e+00 -1.1196e+01 7.3052e-02 2.0858e+00 -#> -1.9106e+01 -4.3791e+00 -1.5648e+01 -3.9139e+00 4.2754e+00 1.4006e+01 -#> -1.2236e-01 1.2929e+01 1.3368e+01 -7.0120e-01 8.7745e+00 -6.1214e+00 -#> -1.8933e+01 -2.1145e-01 -5.3975e+00 -3.9796e+00 -9.0409e+00 -8.7610e+00 -#> 7.7373e+00 -2.1278e+00 3.6165e+00 7.6281e+00 4.9305e+00 7.7253e+00 -#> 8.2651e+00 1.7974e+01 1.0775e+01 -1.3353e+00 8.9461e+00 -3.0251e+00 -#> 4.0353e+00 -5.0770e+00 7.8504e-01 8.8365e+00 -3.8695e+00 1.3443e+01 -#> -7.6197e+00 1.1457e+01 -8.3000e+00 -1.5143e+01 -4.2876e+00 -8.1806e-01 -#> 3.5782e+00 -2.8537e+00 9.9314e+00 2.0893e+00 -3.4795e+00 -1.2507e+01 -#> 6.0520e+00 -1.2071e+01 9.9488e-01 2.2217e+00 5.7808e+00 -1.6132e+01 -#> 1.4790e+01 1.4483e+00 -1.9808e+00 -3.8774e-01 -8.9233e+00 4.7249e+00 -#> 1.2036e+01 2.3744e+00 3.1593e+00 1.5919e+00 5.9775e+00 2.4469e+00 -#> -1.0775e+01 -9.3839e+00 -1.6414e+01 5.2102e+00 1.5185e+01 5.9463e+00 -#> -3.5521e+00 -7.9817e+00 -2.9207e+00 5.7468e+00 -1.1332e+01 8.4551e+00 -#> -1.0195e+01 1.2005e+01 -3.6723e+00 1.1780e+01 -1.6271e+00 7.6696e+00 -#> 2.1713e+01 5.2783e+00 9.3111e+00 8.0497e+00 -1.6087e+00 4.5775e+00 -#> -#> Columns 13 to 18 -1.8652e+00 -1.6296e+01 1.8492e+01 -8.9194e+00 -2.5883e+01 1.1869e+00 -#> -6.7993e+00 5.6378e+00 1.4581e+01 -4.0838e+00 6.1944e+00 -2.9574e+00 -#> 2.2431e+01 -2.6491e+00 -1.2315e+01 -4.1996e+00 -2.0293e+00 9.9904e+00 -#> -1.8455e+00 3.2875e-01 -6.2301e+00 1.0648e+01 -3.1354e+00 4.8623e+00 -#> -2.4562e+00 -4.0854e+00 -1.4479e+01 -4.5397e+00 -1.3717e+01 -3.2325e-01 -#> 1.6040e+01 9.0584e+00 -8.3440e+00 3.6967e+00 9.8770e+00 2.2980e+00 -#> -1.2911e+01 1.2589e+01 6.9104e+00 3.5571e+00 -5.1544e+00 -7.4146e+00 -#> 6.1413e+00 -1.8190e-01 -3.3456e-01 -2.6882e+01 -1.4690e+01 -1.1079e+01 -#> 1.1708e+01 -9.3690e+00 -1.1440e+01 9.3169e+00 7.7772e+00 -1.8925e+01 -#> 1.5197e+01 1.3778e+01 -5.0570e+00 -9.0739e-01 -6.1582e+00 -2.7271e+00 -#> -2.1772e+00 -1.6658e+01 -6.2780e-01 5.7757e+00 -1.0697e+01 2.5060e+00 -#> -1.1280e+01 1.2305e+01 5.9221e+00 -3.2115e+00 -7.5718e+00 8.3008e+00 -#> -1.5762e+01 -2.2615e+00 1.2355e+01 -1.1957e+01 -9.3039e+00 6.8012e+00 -#> -8.7712e+00 8.6509e+00 7.4090e-01 9.2350e+00 1.9766e+00 -1.6338e+01 -#> 1.4290e+01 -3.2918e+00 -1.5809e+01 1.0007e+00 6.8876e+00 -3.0997e+00 -#> 1.4291e+01 1.4412e+01 -8.0414e+00 9.9127e-02 -4.8554e+00 3.1393e+00 -#> -9.8472e+00 9.9958e-01 2.5397e-01 -4.2138e+00 -1.1304e+00 -1.6261e-01 -#> -1.1017e+01 -4.3670e+00 1.7037e+01 8.7364e+00 1.3894e-01 7.6550e+00 -#> 3.4132e+00 -3.2857e+01 2.2227e+00 5.0719e+00 2.8524e+00 -1.2048e+01 -#> 1.4289e+01 4.6767e+00 -1.4386e+01 -8.2073e+00 -1.0569e+01 -1.3552e+01 -#> 3.5160e+00 1.1464e+01 7.5946e+00 -5.3213e+00 -1.9107e+01 
-1.2373e+01 -#> 3.4990e+00 -7.3915e+00 -1.1307e+01 -6.4806e+00 -5.4687e+00 -2.5775e+00 -#> -2.8884e+00 2.8935e+01 -6.5814e+00 1.8260e+00 -8.2682e+00 -1.2801e+01 -#> 1.1532e+00 3.3882e+00 -1.7850e+01 2.8981e+00 1.5333e+01 -9.5075e-01 -#> 5.5637e+00 1.7700e+01 7.4352e+00 8.8628e+00 7.4742e+00 2.0415e+01 -#> 2.9779e+00 2.2275e+01 -1.2943e+00 6.2881e+00 2.5515e+01 -1.5408e+01 -#> -1.9378e+00 5.2831e+00 2.9522e+00 1.0344e+01 4.0745e+00 -1.2546e+01 -#> 1.8626e+00 -2.3987e+01 4.6421e-01 -5.2962e+00 -1.8111e+00 4.6379e+00 -#> -8.5860e+00 9.5225e+00 9.0563e+00 -7.0315e+00 4.4661e+00 8.3436e+00 -#> 9.7812e+00 5.3378e+00 1.3720e+01 1.2092e+01 1.2569e+01 6.4263e+00 -#> 6.8753e+00 -2.3651e-01 -7.5481e+00 5.3557e-01 2.3002e+00 -2.2790e+01 -#> 2.4410e-01 -1.2898e+01 1.6192e+01 -1.8877e+00 3.8370e+00 6.8946e+00 -#> -5.2318e-01 -5.6432e+00 -6.6090e+00 4.1998e-01 5.3408e+00 1.3252e+01 -#> -#> Columns 19 to 24 -1.4445e+01 -7.1449e+00 -1.5830e+01 -8.5493e+00 -1.3246e+01 -1.0818e+01 -#> -5.8257e+00 2.5380e+00 4.0747e+00 3.3717e+00 1.2595e+01 -3.3998e+00 -#> -7.3894e+00 -4.2604e+00 -1.2218e+01 -4.3442e+00 9.4085e+00 -2.5070e+00 -#> -4.1200e+00 -3.6416e+00 9.7555e+00 2.0301e+01 4.9697e+00 -1.1485e+01 -#> -3.8639e+00 3.4220e+00 -1.6872e+01 3.9867e+00 2.3141e+01 1.2290e+01 -#> 1.7509e+01 1.1767e+01 -8.2266e-01 -3.9060e+00 -1.0922e+01 -1.0966e+01 -#> 7.7470e+00 -1.4266e+01 3.9405e+00 -6.0676e+00 1.2489e+00 8.5741e+00 -#> -1.4520e+00 -7.8036e+00 -8.0052e+00 4.7120e+00 1.9998e+00 9.7785e+00 -#> 7.1853e+00 -5.4099e+00 -1.1661e+00 -9.6124e-01 -3.5060e+00 2.8660e+00 -#> -4.3456e+00 1.4829e+01 1.3369e+01 -6.0624e+00 -6.6249e+00 -4.8913e+00 -#> 6.3133e+00 -5.3194e-01 -1.5952e+00 -2.1070e+00 2.6445e+00 5.3696e+00 -#> -1.2118e+01 5.5716e+00 4.6971e-01 -4.8059e+00 -5.0967e+00 -9.4180e+00 -#> 1.5625e+01 -2.5577e+00 -1.1879e+01 -4.4374e-01 -8.2021e+00 2.3238e+00 -#> -4.9951e+00 1.4574e+01 1.0015e+01 6.7822e+00 1.8982e+00 -9.4548e+00 -#> -1.3176e+01 -7.8126e+00 -1.8460e+00 -1.2364e+00 8.4664e-01 -1.9703e+01 -#> 1.4479e+00 1.4478e+01 5.8062e+00 9.4571e-01 8.8584e+00 -5.1504e+00 -#> 1.1455e+00 -6.0675e+00 9.1297e+00 -1.1173e+01 -4.6109e+00 5.5358e-01 -#> -3.5382e+00 9.8266e+00 1.4570e+01 7.3645e+00 -3.1711e+00 8.0725e+00 -#> -1.3910e+01 -1.2044e+01 -1.6904e+01 -6.0675e+00 -4.3612e+00 -8.3484e+00 -#> -4.3631e+00 5.4232e+00 3.4341e+00 2.0606e+00 -5.7591e+00 -3.5410e+00 -#> -8.8304e+00 2.7338e+00 5.3586e+00 -4.3458e+00 8.0168e+00 4.1562e+00 -#> 3.4144e+00 1.1629e+01 -4.1753e+00 -2.2655e+00 2.4787e-01 -1.4080e+01 -#> 1.0627e+01 7.8785e+00 1.1896e+01 -7.2031e+00 -4.6135e+00 -2.0067e+00 -#> 1.1385e+01 1.1738e+01 -1.7749e+01 1.7336e+01 8.0784e+00 7.6059e+00 -#> 1.2071e+01 9.1543e+00 -9.2724e+00 -3.8312e+00 7.5362e-03 -8.5426e+00 -#> 1.0760e+01 6.5716e+00 5.0156e+00 -6.8049e+00 -1.5221e+00 -1.8318e+00 -#> 5.8628e+00 -5.8969e+00 -2.5412e+00 -7.0412e+00 -7.1329e+00 -4.1548e+00 -#> -5.9491e+00 -4.2658e+00 -1.1386e+01 4.8017e+00 -4.1747e+00 -1.5459e+00 -#> 6.2477e+00 3.3169e+00 1.0762e+01 8.2661e+00 -4.6610e+00 1.3204e+01 -#> 1.6609e+00 8.2084e-01 -2.4961e+00 -1.3010e+00 5.8499e+00 1.2042e+00 -#> -1.8975e+01 9.0164e+00 6.3102e+00 1.0104e+01 -4.1855e+00 -2.9264e+00 -#> 1.7456e+01 2.2383e+00 -7.5770e+00 1.1101e+01 -7.0678e+00 -7.7324e+00 -#> 1.2254e+01 1.9148e+01 -9.5767e+00 1.3521e+01 5.4999e+00 1.3138e+01 -#> -#> Columns 25 to 30 4.7672e+00 3.9393e+00 5.8296e+00 -3.1557e+00 9.4979e+00 -2.1713e+01 -#> -6.2923e+00 1.0280e+01 -9.7095e+00 2.7831e+00 3.1785e-01 1.5338e+00 -#> -1.8244e+00 2.1574e+00 1.1629e+01 8.0770e+00 4.8884e-01 1.8520e+01 -#> 
1.5629e-02 -1.8364e+00 1.8026e+01 1.5302e+01 -2.8889e+00 1.3968e-01 -#> 1.1179e+00 -9.3442e+00 2.7496e+00 -6.8790e+00 -4.7582e+00 -1.4035e+01 -#> -6.6229e-01 6.1338e-01 -6.1479e+00 -9.7994e+00 -1.1166e+01 -3.6618e+01 -#> -1.2571e+01 2.6986e+00 3.4589e+00 1.0057e+01 6.3649e+00 1.6529e+00 -#> 1.3031e+01 4.0839e+00 2.0510e+00 -1.5870e+00 -9.3332e+00 9.5409e+00 -#> 1.5391e+00 -2.6677e+00 9.4479e+00 1.2192e+01 -1.5544e+01 4.3714e-01 -#> -5.6149e+00 1.4890e+01 2.8167e+01 2.1890e+01 3.7060e+00 -2.3475e+00 -#> 3.1602e+00 -6.4318e+00 -5.0051e-01 -7.0032e+00 2.8286e+00 -5.1985e+00 -#> 2.1729e+01 4.4250e+00 5.0968e+00 6.9575e+00 1.7531e+00 -2.5253e+00 -#> -1.0086e+00 -2.7643e-01 4.9174e+00 3.8020e+00 1.1939e+01 -2.0344e+00 -#> -6.7084e+00 -3.3351e+00 7.7175e-01 2.1211e+01 2.6425e+01 6.7001e+00 -#> 8.5745e+00 -6.3598e+00 4.2359e+00 -6.8517e-02 -8.5386e-01 5.5816e+00 -#> -8.9783e+00 4.1437e-01 -8.1550e+00 -8.5186e+00 -3.6995e-01 -4.0838e+00 -#> -9.3662e+00 -9.4380e+00 -1.6104e+01 -1.6740e+01 -1.4065e+01 -1.4101e+01 -#> 4.0266e+00 -6.0926e+00 -2.1354e+00 -1.2275e+01 -3.8985e+00 -4.7642e+00 -#> 7.2370e-01 -8.8531e+00 3.0463e+00 1.1427e+01 -4.3989e-01 3.9344e+00 -#> 1.6464e+01 1.1835e+01 9.4309e+00 1.2652e+01 4.9765e+00 2.0312e+01 -#> -5.0763e+00 -5.5975e+00 -5.2981e-01 -9.1223e+00 7.8726e+00 -1.1486e+01 -#> -5.5547e+00 4.3114e+00 1.8073e+00 1.1570e+01 9.9499e+00 6.8600e+00 -#> -6.6898e-01 -3.5121e+00 1.2643e+00 3.2227e+00 -3.4610e+00 1.8418e+01 -#> 4.0002e+00 1.0894e+00 8.4338e+00 -5.1922e+00 3.8902e+00 -6.6107e+00 -#> 3.9277e+00 -6.1210e+00 -1.3713e+01 -1.8275e+00 -3.6425e+00 1.0173e+01 -#> -3.0410e+00 1.9244e+01 1.1407e+01 2.2155e+01 3.5196e+00 1.0021e+01 -#> -1.0214e+01 -3.6439e+00 5.5281e+00 -1.3027e+01 1.2913e+01 -8.8776e-01 -#> -1.2979e+00 -2.1701e+01 -3.9833e+00 -1.3435e+01 1.7802e+01 5.0667e+00 -#> -8.7162e+00 -4.8520e+00 4.9603e+00 -6.5037e+00 -7.7042e+00 -2.6138e+00 -#> 2.7359e+00 -1.6787e+00 1.8766e+01 -6.5021e+00 -6.3636e+00 -1.6554e+01 -#> -2.1235e+00 3.6546e+00 2.7977e+00 2.9845e+00 4.1591e-01 1.8891e+00 -#> -1.6986e+01 5.1112e-01 -5.8523e+00 -8.3724e+00 1.6497e+01 1.3320e+00 -#> 3.8910e+00 1.3113e+00 -4.9165e+00 -1.6934e+01 1.6983e+00 -9.6708e+00 -#> -#> Columns 31 to 36 -2.1679e+01 -2.4109e+01 8.4593e+00 7.2496e+00 -7.6315e+00 -2.8294e+00 -#> 9.3554e+00 -1.6916e+01 -9.3243e+00 -4.0114e+00 -7.3527e+00 4.0655e-01 -#> 1.4200e+01 1.5864e+01 -5.9292e+00 1.3862e+01 2.3485e+00 2.4782e+00 -#> -1.8995e+00 2.3539e+01 -2.6220e-02 1.8460e+01 -1.2601e+00 1.1129e+01 -#> -1.4494e+01 -9.6573e-01 -5.0733e+00 -1.0772e+01 -2.2918e+01 -3.4051e+00 -#> -2.4427e+01 5.6512e-02 -2.4875e+00 -1.4292e+01 -5.5592e+00 -1.3762e+01 -#> -1.4701e+00 8.7661e+00 3.7526e+00 3.6377e+00 -1.2703e+01 -9.5928e+00 -#> -1.0701e+01 -1.4778e+01 -3.5459e+00 -1.4791e+01 -1.0391e+01 1.0136e+00 -#> -9.1675e+00 -1.6266e-01 1.1090e+01 1.5069e+01 1.9667e+01 -7.0169e+00 -#> 2.6015e+01 -2.2708e+00 -1.2905e+01 1.6435e+00 -2.3520e+00 -6.1251e+00 -#> -1.7792e+00 -2.3453e+01 -1.9723e+00 -4.5929e+00 -1.6097e+01 -4.4947e+00 -#> -5.4744e+00 -6.4192e-01 1.6510e+01 -2.6668e+00 2.4133e+00 -2.6066e+00 -#> -1.9293e+00 1.5536e+01 1.1604e+01 -1.0907e+01 -1.0784e+01 1.3530e+01 -#> 1.2010e+01 -9.5282e+00 -7.4836e+00 8.5354e+00 1.5636e+01 6.4028e-01 -#> -8.9697e-01 1.3402e+01 -8.8782e+00 -7.5711e+00 1.0922e+01 -1.6372e+00 -#> -8.5683e+00 7.1512e+00 -3.3600e-01 -8.1503e+00 -1.5620e+01 2.9184e-01 -#> -7.3744e+00 -9.9613e+00 3.9222e+00 -3.9316e+00 -4.0211e+00 -1.2877e+01 -#> 3.4622e+00 -2.0458e+00 1.0160e+01 1.9177e+01 -1.0452e+01 -7.6392e+00 -#> -1.7070e+00 
-1.3017e+01 -3.2206e+00 4.0881e+00 -4.6195e+00 1.0797e+00 -#> 1.7468e+01 9.2982e+00 -1.2351e+01 2.2376e+00 -3.9956e+00 1.3261e+01 -#> -4.4173e+00 -1.4719e+01 -9.7518e+00 -1.7765e+01 1.4858e+00 -3.2349e+00 -#> 3.7165e+00 5.9601e-01 1.4029e+01 -1.0961e+01 8.7351e+00 7.7419e+00 -#> -6.1524e+00 1.3197e+01 3.9808e+00 4.3193e+00 -1.2020e+00 -1.2825e+01 -#> -5.0571e-01 -1.0672e+01 1.8364e+01 8.0442e+00 1.3081e+01 8.4952e+00 -#> 4.4564e+00 5.5969e+00 5.6686e+00 -1.1134e+00 -7.3456e+00 7.7669e+00 -#> 1.3500e+01 4.4614e+00 4.9841e+00 8.9211e+00 2.0098e+01 3.8844e+00 -#> 3.6128e+00 1.1122e+01 -4.4044e-01 -3.6093e+00 1.3210e+01 8.5443e+00 -#> -1.8090e-01 -1.0985e+01 -4.1344e+00 -5.0467e+00 -1.5144e+01 -1.9789e+00 -#> -1.3420e+01 -7.0009e+00 8.1848e-01 -5.0013e+00 1.8268e+01 8.2374e+00 -#> 6.3996e+00 1.8506e+01 -1.1341e+01 7.0303e+00 -1.0490e+01 -4.4531e+00 -#> 5.5464e+00 -1.7529e+01 -7.0866e+00 -8.8303e-01 -5.3377e+00 -7.3746e+00 -#> -8.8998e+00 1.7904e+00 4.3107e+00 -1.4987e+01 -1.3366e+00 5.2522e+00 -#> 8.7654e+00 -7.8483e-01 -7.9464e+00 1.0005e+01 -1.1502e+01 -2.4911e+00 -#> -#> Columns 37 to 42 -1.2982e+01 -4.2126e+00 -3.8256e+00 3.6204e+00 3.1385e+00 -3.8542e+00 -#> -7.1096e-01 -1.1357e+01 1.9772e+01 3.5039e+00 4.9852e+00 1.4891e+01 -#> 1.5609e+00 -2.5245e+00 -8.8441e+00 4.7378e+00 3.7088e+00 8.7780e+00 -#> 7.1751e+00 -5.4980e+00 -1.6342e+01 8.2999e+00 2.4324e+00 -7.6220e+00 -#> 1.0249e+01 1.2807e+01 -6.5801e+00 -5.8649e+00 -4.9927e+00 2.6560e+00 -#> -2.1048e+01 6.4206e+00 1.0280e+01 5.0778e+00 1.0445e+01 4.2637e+00 -#> 1.2026e+01 -6.7137e+00 -8.5624e+00 2.7173e+00 3.2165e+00 -3.5499e+00 -#> -2.1791e+00 -2.1789e+00 4.3917e+00 -1.3265e+01 5.4625e+00 1.8335e+01 -#> 7.3245e+00 -5.1060e+00 7.6107e-01 6.8573e+00 1.4891e+01 -4.1370e+00 -#> 1.1617e+00 9.7119e+00 -6.5697e+00 3.8368e-01 -1.1239e+00 -7.7994e+00 -#> 4.0031e+00 -3.4288e+00 -1.2127e+00 -7.2412e+00 2.3004e-01 8.5065e-01 -#> 5.2983e+00 5.3690e+00 5.7870e+00 -2.9230e+00 4.3193e+00 4.7594e+00 -#> 1.4966e+01 -4.6703e+00 -2.1251e-01 -1.6869e+01 -6.9216e+00 9.3986e+00 -#> 1.6732e+00 -2.7369e+00 5.2217e+00 -2.9938e+00 5.3384e-01 -3.6932e+00 -#> 6.5449e+00 4.4988e-02 -8.0596e+00 -1.0725e+01 9.6437e+00 -1.2317e+01 -#> 1.0285e+00 8.7999e+00 1.4183e+01 1.0116e+01 9.0127e-01 -5.8294e+00 -#> 4.4251e-01 1.2525e-01 -8.3928e-01 8.1308e+00 1.8535e+00 -1.1609e+01 -#> -1.7475e+01 -2.8897e-01 6.4601e+00 9.6699e+00 2.5221e-01 -1.0648e+01 -#> 3.5728e+00 -6.6077e+00 -1.7886e+01 4.0742e+00 1.7618e+00 -3.7558e+00 -#> -2.0264e+00 -2.7089e-01 4.3377e+00 -5.0581e+00 2.4265e-01 8.6139e+00 -#> -9.2931e-01 6.4229e+00 8.4352e+00 6.6326e-01 1.1871e+00 1.4929e+00 -#> 6.7805e+00 -1.0345e+00 1.5609e+01 -1.3094e+01 -1.1205e+01 5.9305e+00 -#> 3.6864e+00 1.0229e+01 1.1679e+01 4.0239e+00 -4.9793e+00 -2.2810e+01 -#> 1.8718e+00 -2.4655e+00 -4.8008e+00 9.0004e+00 -3.6858e+00 6.8290e+00 -#> -5.7660e+00 2.6562e+01 -6.6253e+00 5.0441e+00 -3.5265e+00 -3.7034e+00 -#> -1.9097e-01 -7.5070e+00 -2.1687e+00 -1.3804e+01 -1.7359e+00 1.9275e+00 -#> -7.2095e+00 -9.9872e+00 3.7455e+00 4.0918e+00 -9.8750e+00 3.0196e-01 -#> 5.1022e+00 1.0027e+01 -1.4910e+00 1.2692e+01 -6.9118e+00 7.6790e+00 -#> 1.3758e+00 -7.4730e+00 -1.0146e+01 1.7413e+00 -2.0952e+00 -3.0648e+00 -#> 3.3219e+00 4.3010e-02 -1.1769e+01 -5.4523e+00 -8.0371e+00 -1.5348e+01 -#> -4.3930e-01 -1.0443e+01 8.5679e+00 1.2639e+00 -1.0131e+01 -7.0613e+00 -#> 1.1237e+01 -1.5008e+01 -1.1160e+01 -1.6202e+00 -1.6306e+01 -3.6980e+00 -#> -4.0589e+00 -8.3484e-06 2.1629e+00 1.3816e+01 -3.1079e+00 4.6911e+00 -#> -#> Columns 43 to 48 9.5938e+00 9.8600e+00 
-#> [thousands of lines of printed torch tensor values elided: slice blocks (6,.,.) through (9,.,.), Columns 1 to 54, removed along with the rest of the generated docs/]
-4.8679e+00 7.3759e+00 -9.2898e-01 2.1645e+00 8.0198e-01 -#> -1.5698e+00 9.1562e-01 4.2051e+00 -5.0667e+00 -6.5677e-02 -1.8864e+00 -#> -1.6002e+01 4.7873e+00 -2.8521e+00 -6.0687e+00 -2.3322e+00 8.8416e-01 -#> 1.2803e+01 2.7640e+00 4.7702e+00 -3.9992e+00 -2.7644e+00 5.7228e+00 -#> 1.3082e+00 4.6939e+00 -4.8850e+00 1.8737e+00 7.4326e+00 -8.6145e+00 -#> 1.6210e+00 -7.7442e+00 7.6699e-01 3.9052e+00 -7.2906e-01 8.8425e+00 -#> -3.1784e+00 1.6128e+01 7.4539e+00 -7.2732e+00 -2.3566e+00 -4.6841e-01 -#> 1.8711e+01 -2.0399e+01 -1.7557e+00 -6.4544e+00 7.8731e-01 -3.3787e+00 -#> -3.0505e-01 1.3782e+00 -2.6054e-01 2.7923e+00 -9.2549e-01 1.0817e+01 -#> 1.4942e+00 7.8411e+00 1.3749e+00 -8.3084e+00 -1.3907e-01 7.3425e+00 -#> 3.9848e+00 -4.5565e+00 -1.3949e+01 4.5808e-01 4.6363e+00 -5.0544e-02 -#> 4.7378e-01 2.4725e+00 8.0652e-01 8.6825e+00 3.1991e-01 -3.7330e+00 -#> 2.5099e+00 -7.2661e+00 4.7621e+00 3.0879e+00 -5.1366e+00 -1.6244e+00 -#> -1.0469e+01 7.9192e+00 2.0401e+00 -1.2957e+01 3.4234e+00 -2.3505e+00 -#> 6.6999e+00 7.5522e+00 -2.8123e+00 -3.0786e+00 -1.0235e+01 -2.7743e+00 -#> -1.6822e+01 -1.2617e+01 9.4496e-01 2.7541e+00 1.7316e+00 6.2106e+00 -#> 1.0395e+01 -2.0956e+00 -1.2255e+01 -4.5416e-01 -3.9487e+00 4.3195e+00 -#> 4.6914e+00 3.1649e+00 4.1611e+00 6.1286e+00 5.0255e+00 7.1065e+00 -#> 7.5021e+00 6.8406e+00 -9.6136e+00 1.7223e+01 -1.3089e+01 -3.5942e+00 -#> -5.5354e+00 -5.7653e+00 4.2610e+00 2.9217e+00 7.4059e+00 1.9449e+00 -#> -3.5394e+00 8.6149e+00 -2.3641e+00 -5.7500e+00 1.5863e+00 -2.8873e+00 -#> -1.4569e+01 4.0575e+00 -9.7085e+00 -3.8323e+00 -1.4526e+01 -3.2623e+00 -#> 4.0992e+00 -5.9664e+00 -5.2388e+00 7.7750e-01 -5.2460e+00 1.2358e+00 -#> 6.7283e+00 -1.1768e+01 4.6034e+00 1.5952e+01 1.1168e+00 -9.1776e-01 -#> -3.8493e+00 -2.4761e+00 5.4825e-01 -8.8532e+00 1.0645e+00 9.2228e+00 -#> 1.3682e+00 -4.5516e+00 8.4050e-01 9.9312e-01 -2.7629e+00 3.6249e+00 -#> 1.8553e+01 3.0052e-01 3.0743e+00 -8.5578e+00 2.4675e+00 -3.0139e+00 -#> -#> (10,.,.) 
= -#> Columns 1 to 8 4.1819 -2.7002 -18.1760 -1.1440 -2.9460 6.0683 -1.4644 -8.8791 -#> -5.2517 -2.8884 -7.8021 1.1429 1.3565 18.4330 21.5672 -6.8964 -#> 7.6318 -6.9620 8.9294 -8.1504 -0.3623 0.9137 4.5002 -3.5007 -#> -5.7043 -0.1398 -5.9344 14.1726 4.8862 3.5814 5.0882 -6.4967 -#> -0.8601 -6.4114 11.4836 -7.3367 -7.5688 7.5060 -5.5756 -1.5425 -#> -8.2925 -6.0803 6.6156 -13.9416 1.9437 3.9939 1.1387 7.1872 -#> 2.5452 2.3789 -6.3848 1.8546 1.7084 -12.1336 -3.9954 13.7758 -#> -1.8303 3.3219 5.7164 -3.2491 -1.4877 9.1777 8.8536 5.8444 -#> 8.2554 5.8091 -6.1897 -8.2495 1.8647 -10.6258 -0.8845 -2.1385 -#> 1.7795 7.0254 -9.9362 6.2154 -3.4837 -0.7500 20.6577 2.2969 -#> 2.0167 -1.8900 0.5460 -1.2813 -1.4822 -3.7381 -4.6472 6.2117 -#> -2.2446 -0.4886 -4.3657 6.1555 -10.2198 4.8470 -7.4376 -2.0771 -#> -5.2713 -1.6133 -1.1467 -4.1871 5.4227 9.1644 -3.4754 -5.3385 -#> 0.3023 11.8181 1.3283 -12.7279 -7.5569 10.1936 6.1990 11.6441 -#> -3.3104 3.9960 -2.4172 2.2504 -7.6548 -12.2355 10.1682 7.0920 -#> -6.2455 -2.9252 1.0112 10.4484 1.2471 -0.9377 -12.7561 -5.3247 -#> -3.9030 -9.2627 1.0247 1.9701 -8.9632 0.5715 2.5766 1.9814 -#> -2.1366 5.8496 -6.1677 8.8440 -5.9129 0.9059 -8.7046 -2.5762 -#> 5.2249 5.9455 -8.3770 -7.0487 1.7812 -7.4599 0.4756 18.2719 -#> 2.1648 -0.6042 5.4310 -3.9076 1.8279 3.3609 8.3237 4.8889 -#> -3.8500 8.1186 3.1748 0.4045 -3.5455 1.0117 -5.7007 -7.4015 -#> 2.6253 10.7632 -3.3398 4.9756 -20.4394 3.7262 11.1934 17.5135 -#> -0.7312 4.0369 6.8090 3.6341 -0.8025 1.7529 -4.8715 10.0601 -#> -1.1232 -2.8163 -2.6153 0.1292 -13.4160 3.9119 -7.7021 10.4302 -#> 2.2927 -2.3197 3.5505 8.4843 7.7590 6.3685 -6.6163 -17.7623 -#> 2.3017 5.6381 0.4991 -10.5783 -4.1324 -13.1374 5.4518 6.6804 -#> -2.0478 -7.5091 11.3796 -7.9909 10.7743 -3.7296 -2.3580 -15.9901 -#> 4.0426 -9.7816 8.3604 2.4883 8.6378 9.9463 -3.2728 -0.6286 -#> 1.2885 -0.2170 12.5709 1.1797 -8.4373 -3.5454 -11.9339 -8.7778 -#> 5.4442 1.5278 -9.7229 1.7396 11.1748 -4.8533 -4.4837 -3.0608 -#> 3.1077 -2.4470 7.1603 5.6823 -21.7576 0.4642 -6.7589 10.2302 -#> -4.3704 2.2297 0.9107 -17.3731 2.7938 -0.4466 -1.6011 1.0906 -#> -1.2622 -2.4540 -1.1643 -8.5674 9.3976 2.6191 -14.8639 1.3717 -#> -#> Columns 9 to 16 1.8556 0.0946 0.1899 11.7107 -5.4183 11.5789 9.7279 -2.8268 -#> -3.3962 4.7751 -14.7356 11.8734 2.1015 2.1329 2.1923 3.2486 -#> 1.4306 0.3409 -9.0292 -0.4911 -3.6641 -0.5287 18.8181 -1.0346 -#> -9.8678 -5.0219 15.4837 -13.4747 -5.8500 17.1277 -0.1490 2.6720 -#> -6.8133 -7.6474 5.9233 -0.4694 -1.0901 -7.4239 10.4323 3.4986 -#> -5.5519 -0.0618 11.3379 8.0623 -3.5955 1.7862 7.5723 2.4541 -#> 1.3072 0.6549 6.0913 -18.9477 1.6956 7.7507 1.1176 -0.0589 -#> 6.1180 3.5580 2.0954 3.3366 4.7224 -3.3512 9.1941 -3.5992 -#> -8.3560 -4.7540 1.6382 -13.4815 3.3727 -1.9814 -0.0188 -3.9449 -#> 4.9662 -12.4013 19.1299 -15.2008 8.3711 3.6924 4.5674 -2.7297 -#> 0.6927 -5.4917 -5.8683 -2.1278 -9.2548 -8.4762 9.8112 -6.1467 -#> 0.5953 -0.1684 -3.7758 -3.3063 3.1845 -12.9019 -5.7863 -2.8776 -#> -9.7528 18.2416 19.8239 8.8325 -16.4710 -13.3338 4.2949 7.3125 -#> -3.6096 -7.3529 4.0815 -0.1361 4.7710 -6.8037 -22.0151 11.4496 -#> 13.0295 -16.0956 20.0523 0.4107 -3.8567 6.1604 -10.1262 -17.2949 -#> 3.1475 -0.2741 12.1048 17.4119 4.1416 -7.9165 10.1866 7.0605 -#> 7.9490 6.5653 -0.1566 6.7108 8.9105 4.9730 -11.4455 1.9475 -#> -7.0497 -1.4302 -0.2031 -0.7717 10.2553 -5.8846 -12.2485 5.1642 -#> -1.4445 -1.0975 4.7951 -10.3304 3.8726 -11.5951 18.1794 -1.8722 -#> -3.5374 1.9384 7.5930 -1.7373 -1.3728 0.1740 9.8508 -17.4885 -#> 5.2514 1.7544 7.5342 21.3597 2.5470 -13.2217 
-2.4640 15.8839 -#> 15.8121 1.9745 -12.0076 9.3390 -10.4766 -11.5816 5.4715 15.8169 -#> 2.7863 4.0846 3.5614 11.5852 13.7292 -3.9139 -1.5637 -4.1002 -#> -8.6593 5.3422 -7.6377 -6.7198 2.7386 -2.0494 -7.7664 1.0160 -#> -7.9545 11.3175 5.0165 2.6514 -4.5948 4.2351 -5.6257 -0.6661 -#> 2.0352 8.5044 -11.0207 3.9361 -3.1783 -5.0761 -10.5843 -6.2035 -#> -9.2982 3.4657 -8.0456 1.1406 -5.7957 -0.0046 -2.1324 8.1229 -#> -16.6553 10.8810 -0.9489 13.3421 -8.0725 9.1832 -4.2330 7.3050 -#> -8.2410 7.6877 4.3861 -15.3545 -0.3129 0.7203 1.5005 6.7024 -#> -6.5903 -1.3819 -6.9875 -4.7477 4.7515 0.0780 -13.0780 -12.4752 -#> 0.5467 0.2573 -11.7447 11.3582 0.7454 -11.0607 -4.9295 6.2202 -#> 0.9685 0.8027 11.2603 1.4244 -10.0601 -9.5701 14.1312 11.6304 -#> -11.8415 2.6648 0.9239 -9.2368 -4.5917 12.4541 8.8730 4.3792 -#> -#> Columns 17 to 24 17.7241 16.4431 -4.4976 4.1706 -12.8776 -15.2067 -10.9706 4.8594 -#> -2.4854 4.0536 0.9395 0.8457 -2.1911 -0.2284 2.7090 -5.2006 -#> -1.5690 2.2409 15.2344 2.3106 -15.5675 4.6727 -18.2378 -8.5262 -#> -19.8752 9.8127 -10.5200 8.9852 -14.8782 -3.1370 4.1668 -10.8551 -#> -3.6712 -12.0466 7.5911 7.8817 -14.2461 -1.8762 6.1248 -4.2913 -#> 7.4452 -0.9572 -3.7197 1.3354 6.7883 -13.2608 -13.4355 14.7262 -#> 10.5368 -0.6395 15.0316 7.2175 -6.9461 -20.6147 1.1905 13.3449 -#> 9.9587 17.1341 -11.5921 -1.7717 -12.4661 6.2366 -9.3802 -1.1792 -#> 0.3752 8.5826 -13.2740 -0.0306 13.3570 17.4113 -12.8242 9.3084 -#> -11.8180 9.5951 6.7905 12.8367 -8.0408 -15.6782 6.5608 -18.7112 -#> 22.3179 -7.4422 1.5256 11.6816 5.9146 -10.3219 -3.7901 7.5836 -#> 3.8820 -7.6058 8.6871 -4.0585 8.9200 -4.1044 -1.5731 -3.6648 -#> -1.7718 -1.6068 -6.4439 11.8292 -2.6022 -16.4736 -11.8727 11.3917 -#> -17.9264 -4.7390 6.3397 -8.9043 8.8302 -16.6837 2.7382 -14.5637 -#> -1.9014 7.9378 1.7300 0.9099 -7.4772 -0.5940 -6.2970 -17.5786 -#> 1.4715 -0.3907 -9.3192 10.1859 11.9984 -14.9277 12.3849 17.7064 -#> 2.3705 -17.2180 4.6233 4.2500 -4.6967 -11.6569 -5.3638 -11.5411 -#> -1.3751 12.4701 -7.0619 4.3700 -0.2691 -0.2517 6.5075 2.4046 -#> 9.9320 -0.5399 3.7089 -8.3076 4.6503 -6.4728 -24.8052 24.1327 -#> 1.9584 7.0046 1.5319 -5.6831 -14.7582 1.4350 -3.2149 16.2855 -#> 12.0401 -0.7776 3.8262 -6.7161 0.6593 2.0989 -4.3371 12.7094 -#> -19.7485 3.2476 -10.9042 14.1623 8.4786 7.8983 0.4614 -12.9178 -#> -7.0698 1.4244 -11.8563 1.1273 15.1136 10.7658 2.1623 -7.2914 -#> 4.7051 3.1767 -8.4571 -4.8364 14.5448 8.8465 -1.5965 9.6834 -#> -4.4673 -7.8511 0.8354 5.9707 3.3181 12.2863 11.1134 -10.8450 -#> -7.2237 0.3425 17.3627 -14.8729 2.3539 -13.4249 3.3829 4.3209 -#> 1.4075 -4.3673 1.7913 -1.9425 10.2997 -7.9903 1.3192 5.3896 -#> 19.7331 -9.6629 -7.2495 -9.8141 5.0245 8.9532 -18.5641 8.1088 -#> -7.1797 -1.4011 -7.3396 2.2222 -7.3143 7.9236 2.3504 -12.2137 -#> 6.8541 2.4263 0.0412 8.6535 -3.8287 4.7261 8.2146 -1.9722 -#> 0.9613 2.0930 -8.1847 8.7364 11.5909 4.4428 -15.8480 2.7165 -#> -14.3310 3.1957 13.1065 11.7513 -4.6397 4.3785 -23.1525 2.6458 -#> 8.8166 9.1156 0.1519 -5.9758 6.7014 -3.0737 -10.1020 1.6576 -#> -#> Columns 25 to 32 10.5292 -3.9645 12.8190 2.1494 10.7930 11.1448 9.9790 12.4497 -#> 11.4912 12.2822 -0.5843 -8.5035 -10.4811 7.8215 5.3842 1.7837 -#> 9.6353 14.6377 3.8013 14.2778 -5.8059 19.4480 -11.1792 9.9659 -#> 1.2615 -12.5058 -1.1939 -1.1402 -15.7723 5.3684 1.1663 -5.7907 -#> 2.3630 -5.0501 13.3084 18.5605 1.1618 8.6902 -2.5698 -1.1850 -#> -13.9340 4.3304 5.8433 -7.3367 4.6003 -6.4036 4.8493 5.0689 -#> -1.3893 14.5972 16.2413 14.9188 -0.5374 -7.7302 -0.7117 -2.1723 -#> 1.4423 2.1201 1.2344 -0.5692 14.9963 3.6928 -3.9090 
0.6049 -#> 0.7095 -7.0357 -2.1930 -3.7832 -5.9221 -8.0483 -6.4077 2.4676 -#> -1.5388 0.4704 12.8991 -6.0182 14.1894 -0.9701 9.0996 -16.5178 -#> -10.3212 -10.7765 4.8461 4.5388 4.3799 -10.6147 0.8329 -6.4080 -#> 7.8106 2.1017 -9.8434 -0.4331 5.9101 -3.3945 -2.3179 -2.2783 -#> -13.8799 0.6262 -4.3279 3.8890 -3.8753 -8.0789 -0.7328 -4.0891 -#> 14.3206 7.7849 3.6460 -4.0085 -7.5287 -11.1466 4.8729 -11.5882 -#> -0.0881 -2.9461 2.1459 11.0899 3.5483 -0.2515 2.0340 1.7700 -#> 9.2189 -13.4063 -2.4231 3.0836 5.5822 2.4684 2.2476 1.8075 -#> -2.4374 2.6486 -5.4516 1.9154 5.0113 -13.6252 9.9636 -11.3138 -#> 2.4418 -4.0569 -8.3882 -19.7733 -2.0071 4.8646 2.7268 6.5430 -#> -5.4768 -4.4051 17.2406 1.0203 7.2529 -12.8022 1.9017 4.4126 -#> 5.6344 7.9751 3.8124 -4.3246 6.1148 15.3043 -18.8127 10.1895 -#> 1.9013 4.9514 4.0854 -3.0271 19.8358 -0.8026 14.0764 -0.8553 -#> -2.2414 -11.0523 18.3026 -1.5447 0.4390 -9.4300 0.1453 -6.5077 -#> -12.4334 6.4966 -14.2762 13.4793 6.2123 0.7264 -8.5034 -7.8488 -#> 3.6735 -10.9852 12.3695 -3.4940 3.7098 -15.8378 -2.1686 -8.8639 -#> -6.1870 6.3384 -17.9945 4.0391 5.4411 3.9668 -13.1839 6.2893 -#> -12.5335 10.5731 -9.3001 -6.2328 -10.8204 4.7457 3.1881 6.8758 -#> -9.5590 4.8932 -1.7767 -4.7540 -8.1867 0.3086 12.7548 -12.7002 -#> 3.6840 -12.3161 -6.0951 -1.2251 4.7612 -1.1004 -1.5123 -2.2366 -#> -3.0114 -9.8084 -9.4691 0.2194 -9.6904 -13.4815 -1.6036 -6.8736 -#> -13.0158 -10.7943 -6.4755 4.5214 0.5924 -7.7346 8.6752 10.1047 -#> 7.4194 -13.7749 5.0595 5.8050 5.8419 -0.0406 9.7345 -1.6591 -#> -12.7815 0.2335 2.2811 -2.3682 -7.0829 -12.3729 -3.2147 1.8516 -#> 8.9504 -2.9962 8.6639 -3.1886 -5.3588 5.4321 -9.9454 -5.3727 -#> -#> Columns 33 to 40 -2.6904 -3.1793 -3.9982 2.2110 -15.0660 -7.8316 -4.5374 -3.6815 -#> 6.3429 -0.5526 -12.4770 10.9454 10.4932 2.0863 7.1767 -15.6242 -#> 9.9699 -2.9062 -7.5707 -4.7522 -8.1896 -2.9347 -0.8152 -4.8127 -#> -12.2653 11.1087 0.8183 -3.0673 5.4707 -7.3395 8.2259 0.0790 -#> 10.7333 11.7357 -0.5174 -2.7349 -11.8983 4.5970 4.7187 7.6155 -#> 0.8824 -0.0948 2.7482 -16.1985 -3.4809 -13.0749 -5.6538 1.3078 -#> -4.6005 10.9233 -8.3767 -23.2996 -9.9544 4.9645 6.6196 -1.6382 -#> -4.8016 1.2827 2.5463 -9.7602 -11.3491 -6.9045 8.6490 -3.3517 -#> 10.8844 -4.3575 4.9099 25.9703 -7.6978 -8.9655 -17.4221 1.0042 -#> -4.2847 2.7185 -2.9281 -5.9636 7.4574 0.7742 -1.9685 3.4123 -#> 9.2045 5.2980 15.7517 -5.8331 -15.0311 7.7835 -9.3464 -1.9853 -#> 6.4127 -4.6117 19.0878 -3.0422 -1.5535 -2.1965 -1.4020 1.2183 -#> -11.4244 7.5634 8.4587 -1.8272 3.1815 11.3702 1.6303 8.1204 -#> 1.2992 -2.9209 -2.0812 11.5782 22.2623 7.6336 -5.1908 5.6859 -#> -2.1399 3.7254 -9.4623 -7.0854 -2.7738 12.2496 1.6377 2.5113 -#> -12.5850 -4.9262 3.3072 -5.5994 4.4245 -1.8885 -0.8919 -10.0518 -#> 13.9544 -2.2759 4.0960 -12.4514 -14.1780 -0.7708 -2.8591 2.4331 -#> -0.4444 -4.2921 7.0387 6.5836 9.4148 -14.9592 4.8886 7.0435 -#> -19.0743 -3.3144 -13.8473 -2.4842 -1.5997 -0.7481 15.9668 0.6583 -#> -10.1261 -2.2332 -7.1975 -2.3450 9.0592 10.7116 -0.8987 -8.9685 -#> -11.9545 -8.5190 4.8090 13.1186 10.6559 -4.5238 -2.1798 -8.8657 -#> -1.9783 8.6018 3.2554 -16.5278 -14.0102 -3.3953 9.9409 3.1031 -#> -13.6221 -1.6317 17.8017 -13.5058 -0.0398 -20.7438 -2.9618 4.1430 -#> 0.8359 -2.0096 -1.9416 4.9894 4.1868 -2.3571 5.3722 7.2971 -#> 2.0086 -12.1338 17.6228 -10.7163 5.5852 -7.9251 1.4832 -14.7635 -#> 5.7109 5.6109 -7.2849 1.9554 5.1607 2.8878 1.7395 1.9746 -#> 1.1878 -8.7317 -0.0576 13.3530 1.2356 4.8184 -11.7238 -1.7304 -#> -5.7051 -4.8007 10.1910 18.0028 -12.7282 -13.6544 2.4402 -9.5052 -#> -2.6643 
14.4130 10.9647 -9.3912 3.2169 -7.6773 -12.9542 0.5395 -#> -7.9634 -6.3668 -8.8339 3.0626 3.5328 -5.4092 6.0550 -2.6830 -#> -5.5458 1.8084 3.9153 6.7767 5.4337 -10.1347 -0.7966 9.2530 -#> 5.2944 -15.6789 16.6287 0.8781 -8.6488 12.3779 -11.0757 -2.0095 -#> -1.9817 1.1022 3.5618 -4.4506 0.1626 -8.8249 -2.2472 -2.0069 -#> -#> Columns 41 to 48 17.5704 3.3890 -14.7766 8.8433 -8.1115 18.5641 5.5488 -15.7888 -#> 1.3682 -0.3895 -13.8270 -15.9155 -4.3090 -8.9518 10.9437 -12.4455 -#> 17.5254 1.6004 1.1251 -3.3617 6.1294 14.9661 -12.5143 14.6155 -#> 2.5869 11.4185 -0.2229 -15.8317 16.1505 20.4406 -10.4021 4.9527 -#> -4.8332 2.2539 2.9690 -19.8020 -0.6848 10.3750 -3.9140 -7.7243 -#> 9.0005 -4.9784 -14.8227 -5.6539 -22.5182 3.3849 -6.7069 -9.1133 -#> -13.7044 8.7026 15.7930 -21.8754 14.5765 -6.7201 -2.8081 2.8092 -#> -8.3891 -2.3845 8.5504 12.8548 1.4612 -3.3912 -12.1353 5.9169 -#> -0.5118 0.2147 5.5487 -9.9571 7.8578 31.2956 -7.6141 6.0582 -#> -4.1179 17.5565 -1.9947 7.3969 1.8679 -1.0288 -10.8080 -6.1185 -#> -12.9663 -2.5697 12.2865 13.0955 -2.8212 -13.4418 -3.3537 11.4065 -#> 0.6306 -6.9157 -22.3353 7.5794 5.3888 -1.9895 6.8287 -15.9672 -#> -3.7175 -7.1140 -7.8213 0.2052 7.8420 23.6224 -1.2615 5.3317 -#> -2.8885 7.7248 5.9880 -12.1741 -8.5161 3.1150 20.8556 -8.3987 -#> 2.4623 2.5093 8.3615 -4.6639 6.6286 -13.4688 -6.7489 9.2497 -#> -9.4874 0.9945 5.8155 -4.5071 -3.0293 -14.6485 0.9375 -9.3388 -#> 5.6274 -2.6494 2.2440 6.4652 -6.9652 -8.7051 -4.4503 -11.7992 -#> -7.1215 -11.7251 -0.0189 -7.7000 20.6908 -9.6292 17.5161 0.8517 -#> -1.1748 16.3685 17.1124 -0.1526 9.9417 -15.2438 -2.5329 -3.7418 -#> 9.6564 9.9983 -10.8354 6.3171 5.9041 -1.7046 -10.1057 1.9377 -#> -8.3920 -5.5874 -3.4125 9.7132 11.8479 8.2080 4.9230 -10.1816 -#> 9.3801 3.2812 4.9593 2.1379 -10.3081 -8.9848 -15.6181 -5.7878 -#> 1.7352 14.5038 -3.1819 -9.0198 9.3146 9.1462 -12.0767 -4.1429 -#> -7.9687 1.5227 -10.3528 -6.1670 -12.7228 -8.4186 9.4608 -7.8613 -#> -0.3986 -0.5350 -20.1001 7.9296 2.2427 14.5613 16.7416 -7.5759 -#> -4.4657 1.9143 -2.0119 9.2431 15.5990 -9.8887 -2.0631 -5.6964 -#> -4.3807 2.3498 2.3197 2.3938 -1.6953 5.2110 4.9993 2.3942 -#> 8.3853 -10.0567 17.5188 -0.2648 6.5264 23.4170 0.1300 4.5117 -#> -16.0887 2.2805 16.1597 -5.0918 14.6784 0.1509 17.1091 10.7455 -#> -5.0488 -1.5862 3.4866 -7.5867 12.9261 1.0782 -9.2606 -10.5018 -#> 4.2590 -8.3886 12.6921 14.9392 -12.0440 -5.4093 -14.8714 9.9903 -#> 10.3134 7.1416 -2.7204 -8.9261 -3.6273 29.5756 -9.7316 12.0568 -#> -4.3794 -11.0773 10.8934 -7.9329 5.4392 -4.8761 4.0482 7.3628 -#> -#> Columns 49 to 54 -14.5902 -0.2382 4.3159 -2.6877 5.0703 2.4193 -#> -7.1263 -7.5356 -5.7498 7.8019 -0.7866 -2.3049 -#> -12.3100 -1.9089 0.3939 9.3053 -5.7823 1.2514 -#> -10.8058 5.7169 6.6433 11.4855 -1.1410 -4.5655 -#> 8.8070 5.1104 8.8327 12.1930 5.0258 5.7235 -#> -25.6938 -3.5064 -2.6577 -15.0395 -2.9256 -3.4516 -#> -6.6522 -0.1651 -11.4028 -6.1774 2.9720 -3.9651 -#> -11.4142 -13.1631 14.7867 5.8335 5.6973 0.3327 -#> -24.5765 5.7319 12.5413 -2.6588 1.5041 -1.9145 -#> 6.3423 -8.7484 -4.6776 1.1261 -4.7972 0.9373 -#> 13.7444 -5.1163 -2.2804 -7.3522 3.7646 6.7636 -#> -6.6618 -3.2632 10.3643 0.2059 -1.5956 2.7000 -#> -20.9447 4.3667 -0.4386 -11.0839 1.1866 3.1471 -#> 10.5578 -6.5905 -5.1642 6.4413 0.7218 -5.2311 -#> 1.3020 9.0326 7.8362 0.2074 -1.2691 -3.3912 -#> 0.0193 11.9974 -4.4291 -10.2958 -13.7476 -0.6264 -#> -7.6380 2.1646 -13.6631 -4.7001 5.4647 -4.1669 -#> 5.9041 4.0855 -2.6119 2.6817 6.5380 -0.4950 -#> 6.0665 3.0488 7.5467 -16.5522 1.2797 -2.5643 -#> 2.4455 -9.6235 7.8516 -1.5448 2.5017 
3.9083 -#> 10.5918 10.5633 4.1331 3.3853 -0.5743 0.0424 -#> 6.7426 -6.8340 -1.2416 -1.1304 -13.0831 -2.6119 -#> -9.2593 -1.9201 -3.0986 14.1559 -0.0596 -11.1140 -#> -11.5428 2.5954 10.6733 -4.6600 -3.8252 -3.4938 -#> 1.3032 -0.2393 -5.7651 3.2347 1.5240 -3.7687 -#> 4.9202 -6.3740 -17.5235 4.7132 2.0002 -0.2020 -#> 11.3402 5.2820 -17.9162 -0.2594 -7.4186 -2.8355 -#> -19.4105 5.8406 7.7503 -7.5794 1.5874 -2.1943 -#> 1.6365 -4.2029 -6.0806 1.8311 -1.0763 0.0894 -#> 17.7446 21.0740 -0.0280 -10.0488 5.2612 4.8506 -#> -4.6811 9.2956 18.5009 -6.1713 -1.4305 0.4427 -#> -10.9019 7.2243 -14.8779 -8.3666 -7.9722 0.3536 -#> -8.8841 2.5430 -2.5944 1.4092 -9.3180 -2.0870 -#> -#> (11,.,.) = -#> Columns 1 to 6 -3.7849e+00 -5.0196e+00 2.1432e+00 1.7021e+00 2.1228e+00 -4.3782e+00 -#> 4.1938e+00 -3.9595e+00 4.0635e+00 -5.5167e+00 5.6394e+00 4.2589e+00 -#> -2.8545e+00 1.2807e+00 -6.3746e+00 -3.4508e+00 1.5402e+01 5.1446e+00 -#> 2.8870e+00 3.8449e+00 9.1714e+00 6.2121e+00 8.3932e+00 8.3983e+00 -#> -2.2077e+00 8.2990e+00 -1.5067e+00 -5.9537e+00 -1.3866e+01 -6.3206e+00 -#> -3.9909e-01 4.0141e+00 -1.2573e+01 8.3191e+00 2.9710e+00 -2.4224e+00 -#> -1.8828e+00 -2.7120e+00 4.2654e+00 -1.2382e+00 -1.4503e+00 5.6398e+00 -#> 3.4461e+00 -6.0116e+00 2.6112e+00 -4.3507e+00 -6.3859e+00 -4.9195e+00 -#> -1.6851e+00 -9.8991e+00 5.0588e+00 -2.4422e+00 6.6687e+00 1.7778e+00 -#> 4.5697e-01 -7.3823e-02 4.4258e+00 -4.7368e-01 1.0315e+00 3.9583e+00 -#> 1.1417e+00 -1.6804e+00 -2.8100e+00 -2.1802e+00 -3.9562e+00 -1.0400e+01 -#> -5.8564e-01 1.0935e+01 2.6781e+00 4.0670e+00 8.1068e-01 -3.8475e+00 -#> 4.1610e+00 -5.0571e-01 2.9245e+00 -7.9322e+00 -1.6292e+01 -3.1278e+00 -#> -1.4342e+00 -2.3585e+00 4.6606e+00 1.9605e+01 2.9582e+00 1.8324e+01 -#> -5.5404e-01 7.1800e-01 -1.4267e+01 4.0446e+00 5.6713e+00 -1.4284e+01 -#> 4.1463e+00 7.7104e+00 -1.1590e+01 -4.9941e+00 6.1806e+00 -1.0090e+01 -#> -2.4521e-01 8.6493e+00 3.0875e+00 1.5020e+00 7.0473e+00 -7.6942e-01 -#> -1.9347e+00 -1.0450e+01 3.0863e+00 2.8111e+00 1.0976e+00 1.5948e+01 -#> -2.6669e+00 -1.8782e+00 2.9217e+00 1.3300e+01 -1.6403e+00 -8.4687e+00 -#> 1.7010e+00 6.6925e-01 -2.5950e+00 1.3026e+00 8.2395e+00 4.8721e+00 -#> 2.0192e+00 2.7153e+00 2.1160e+00 -7.7253e+00 -8.3825e+00 -3.1612e+00 -#> 3.9173e+00 -6.8199e+00 1.5681e+01 8.2122e+00 2.8327e-01 -9.1162e+00 -#> 1.6034e+00 3.3712e+00 1.5284e+00 1.3668e+00 5.2616e+00 4.2150e+00 -#> 9.3535e+00 5.1441e-01 -5.8771e+00 6.4237e+00 -7.5995e+00 -1.9427e+01 -#> 1.0015e-02 5.9925e+00 -4.5338e+00 -4.7200e+00 5.7863e+00 3.5718e+00 -#> -3.8353e-01 1.7491e+00 -1.1367e+01 1.3499e+00 -7.0512e+00 1.4846e+00 -#> -6.6765e-01 6.4164e+00 -7.2331e+00 8.6983e+00 -4.7788e+00 9.2212e+00 -#> -3.8685e+00 3.6740e+00 4.8216e+00 1.3614e+01 2.4227e-01 1.1748e+00 -#> -1.2426e+00 -6.0361e+00 -5.0742e+00 -1.0331e+01 -1.2833e+01 7.2044e+00 -#> 8.5433e-01 3.0204e+00 6.8311e-01 -1.3152e+00 -1.2573e+00 1.1791e+00 -#> 1.0658e+00 -1.8278e+00 2.1841e+00 1.3825e+00 5.6781e+00 -1.4014e+01 -#> -3.4659e+00 -2.0254e-02 -6.6509e+00 -2.2007e+00 -1.0925e+01 -3.6717e+00 -#> 3.3764e+00 2.2750e+00 -8.4900e+00 7.7153e+00 -4.4704e+00 -1.6245e+00 -#> -#> Columns 7 to 12 6.0973e-01 -1.4816e+01 -1.6339e+01 -3.2286e+00 4.9790e+00 3.0585e+00 -#> 6.8249e+00 3.8479e+00 1.2093e+00 -4.8146e+00 -1.4413e+01 -2.8921e+00 -#> -1.7814e+00 6.0191e+00 -1.1324e+01 5.8392e+00 -5.0403e+00 3.6101e+00 -#> 4.6124e+00 2.8248e+00 9.4244e-01 2.1540e+00 5.7656e+00 1.0509e+01 -#> 1.6468e+00 8.3349e+00 1.0383e+01 1.0796e+01 -1.1249e+01 4.1644e+00 -#> -1.0202e+01 -1.2100e+00 -1.3321e+01 -1.6383e+01 -1.6935e+01 
-4.5484e+00 -#> 8.5930e-01 -5.8933e+00 7.5169e+00 5.9695e+00 1.0066e+00 -6.7549e+00 -#> 2.3309e+00 -4.0142e+00 -2.3207e+00 -2.2595e-01 3.0484e+00 -2.7808e+00 -#> 2.0428e+00 -6.2656e+00 -7.2521e+00 -5.2489e+00 9.3829e+00 1.3080e+00 -#> -6.5883e-01 -4.0317e-01 -1.2260e+00 -1.4979e+01 2.1824e+00 2.7155e+00 -#> 7.3020e+00 9.7149e-01 -1.2680e+01 9.1245e+00 1.1231e-01 -1.4746e+01 -#> -8.0618e+00 -2.8863e+00 -1.4427e+01 -2.7446e+00 3.0282e+00 -7.5802e+00 -#> 7.9672e+00 -3.7802e+00 -5.4363e+00 7.7521e+00 -1.0090e+01 -7.8178e+00 -#> 1.2089e+01 5.3239e+00 4.9814e+00 -5.6864e+00 -3.6484e+00 -4.6010e+00 -#> -2.8823e+00 -1.1789e+01 -1.1138e+01 -1.0029e+01 1.0326e+01 -1.0063e+01 -#> -5.3812e+00 3.2276e+00 -3.6145e+00 -3.5016e+00 -4.2648e+00 -4.8598e+00 -#> -8.2244e+00 2.2045e+00 -5.3808e+00 -9.6569e+00 1.8693e+00 -3.7419e+00 -#> 4.7899e+00 4.2961e+00 -9.5148e-01 1.1050e+01 2.0492e+00 -6.1241e+00 -#> -9.6661e+00 -1.4662e+00 -1.4761e+01 4.7909e+00 9.8864e+00 5.9882e+00 -#> 1.0315e+01 -3.5138e+00 -1.0110e+01 8.8079e+00 6.5133e+00 -5.1630e+00 -#> -1.4707e+01 -1.8352e+00 8.1046e+00 3.2276e+00 5.4078e+00 1.4720e+00 -#> 4.0316e+00 -4.7272e+00 2.2854e+00 -1.0523e+01 -9.7445e-01 -2.0957e+00 -#> -8.9254e-01 4.7154e-01 -3.2894e+00 -4.5754e+00 -3.8137e+00 -7.3894e+00 -#> -2.3613e+00 -1.3534e+00 1.3022e+01 4.2130e+00 3.0146e+00 1.2287e+00 -#> 6.0422e+00 -2.9257e+00 -8.1346e+00 -5.4069e+00 -6.5148e+00 1.1687e+01 -#> -4.2277e+00 6.4720e+00 -2.5719e+00 -5.1238e+00 -6.9550e+00 -1.9845e-01 -#> -4.8493e+00 -3.1173e+00 -3.0814e+00 4.2641e-01 -2.9877e+00 4.9414e+00 -#> -6.0841e+00 -4.0842e+00 -1.1773e+01 4.2059e-01 -8.0204e+00 2.0235e-01 -#> 3.8129e+00 -5.0090e+00 8.0143e+00 7.2605e+00 -8.3539e+00 1.9997e-01 -#> -5.2110e+00 -2.1361e+00 5.1565e+00 5.2963e+00 7.4855e+00 8.4635e+00 -#> -6.7656e+00 1.0277e+01 3.9130e-01 1.8962e+00 5.5895e+00 -7.6272e+00 -#> 4.1073e+00 4.8892e+00 1.0822e+00 -4.1000e+00 -1.1186e+01 1.5630e+00 -#> 2.1621e+00 -1.8975e+00 -1.0544e+00 7.2115e+00 -5.1892e+00 1.8350e+00 -#> -#> Columns 13 to 18 1.2055e+01 -1.5823e+00 -4.0961e+00 4.4132e+00 1.6397e+01 -7.4848e+00 -#> -8.9138e+00 -1.2013e+00 8.9636e-01 -2.3079e+00 -9.8273e+00 -3.5474e+00 -#> -4.1584e+00 9.4283e+00 3.3601e+00 1.1121e+01 -2.0212e+00 -7.8219e+00 -#> -5.1769e+00 3.6503e+00 -1.1306e+01 4.6026e+00 -6.1186e+00 4.3157e+00 -#> 7.2794e-02 -3.5959e+00 -6.4562e+00 -4.8976e+00 -8.0606e+00 -6.6448e+00 -#> -2.9705e+00 -1.6580e+00 9.4389e+00 -1.0084e+00 -1.6057e+00 5.3935e+00 -#> 1.2491e+01 -4.7120e+00 -5.2667e+00 1.6226e+01 3.9891e+00 -1.4046e+01 -#> 1.3069e+01 -3.1612e+00 -9.8630e+00 -2.3612e+00 1.6367e+00 2.9848e+00 -#> 2.8621e+00 1.2066e+01 -8.5787e+00 1.0951e+01 -5.8907e+00 8.6545e+00 -#> -3.6769e+00 8.8309e+00 -4.3361e+00 -2.2998e+00 -5.8985e+00 1.7773e+00 -#> 5.7171e-01 4.1866e+00 -3.4568e+00 2.9769e-01 1.8227e+01 2.3260e+00 -#> -4.9518e+00 -4.5572e+00 -1.9396e+00 -3.8327e+00 -4.9439e+00 1.4059e+01 -#> -1.8056e+00 -1.0121e+01 -1.8802e+01 -3.8292e+00 1.4599e+01 7.9980e+00 -#> -5.1516e+00 -6.4763e-01 -1.7462e+00 -1.9015e+01 5.6483e+00 -3.0024e-01 -#> -1.0154e+01 -8.1130e+00 3.3937e+00 6.0944e-01 -1.4344e+01 9.7953e+00 -#> -4.8702e+00 2.4856e+00 7.8282e+00 3.4010e-01 -1.4870e+01 1.3035e+00 -#> -1.1545e+00 -3.1283e+00 5.7233e+00 2.2559e+00 -1.8827e+00 -2.2008e+00 -#> 2.7410e+00 -2.7028e+00 4.8085e+00 -5.9716e+00 -6.9213e+00 -6.9172e-01 -#> 1.5735e+01 4.2947e+00 -3.2594e+00 7.8355e-01 -4.0335e-01 -2.7748e-01 -#> -1.2516e+01 9.3661e-01 8.7588e+00 -1.0998e+01 -1.0668e+01 3.7740e+00 -#> 2.1174e+00 -1.1302e+01 2.5620e+00 -1.2488e+01 -3.4072e-02 1.3111e+01 
-#> 1.3338e+01 -2.0520e+00 -1.5102e+01 -2.9870e+00 5.0300e+00 6.6853e+00 -#> 9.5776e-02 -1.1173e+01 -7.8260e+00 -7.9570e-01 -4.0905e+00 3.1859e+00 -#> 9.8249e+00 -8.4933e+00 6.2898e+00 1.8266e+00 4.9972e-01 1.1476e+00 -#> -3.1772e+00 5.4599e+00 9.6864e+00 -9.3830e-01 1.3744e+01 4.6697e+00 -#> -7.9928e-01 9.6497e+00 1.1326e+01 5.6546e-01 -4.4016e+00 1.2161e+01 -#> -4.6892e+00 -2.7207e+00 1.5347e+01 -1.3501e+00 1.0858e+01 -1.9357e-01 -#> 3.9911e+00 -2.1637e+00 3.5168e+00 -5.6583e+00 4.2209e+00 4.6337e+00 -#> 9.9214e+00 -7.2976e-01 1.0678e+01 1.8799e+01 1.7413e+01 -8.4330e-01 -#> 2.3533e+00 -5.9228e+00 1.3134e+01 -1.4971e+00 3.8762e+00 1.5080e+00 -#> -7.7518e-01 1.4636e+00 -7.8376e+00 -1.2756e+01 -3.8406e+00 3.6612e+00 -#> 2.5242e+00 8.6385e-01 -6.2545e+00 4.3963e+00 9.6608e+00 1.5203e+01 -#> -4.8494e+00 -6.9702e+00 4.1564e+00 4.8794e+00 1.7624e+01 -7.4009e+00 -#> -#> Columns 19 to 24 -1.1275e+01 -8.8870e+00 -1.1503e+00 -1.3656e+00 -1.4318e+00 -4.2171e+00 -#> 1.2225e+00 6.3358e+00 1.4911e+00 -9.2090e+00 2.2159e+00 -3.0175e+00 -#> -6.2561e+00 -6.9127e+00 -1.5702e+00 -1.1056e+01 3.5933e+00 -2.8720e+00 -#> -8.9248e+00 -5.0138e+00 1.0471e+01 -5.2612e+00 5.8908e+00 1.0312e+01 -#> 9.3393e+00 -1.1216e+00 -9.9527e+00 -5.3251e+00 3.2107e-01 -2.6164e+00 -#> -1.4038e+00 -2.8797e+00 3.0678e+00 1.7291e+01 -1.7825e+01 1.5292e+00 -#> -1.1628e+00 -1.6507e+00 4.8153e+00 -1.1936e+01 -1.5914e+00 -1.2476e+01 -#> 5.0162e-01 -2.5429e+00 -3.8954e+00 -1.0717e+00 -7.4392e-01 -8.1214e+00 -#> -2.0364e+00 -3.6251e+00 -1.8497e+00 4.7399e+00 1.0754e+01 -1.1310e+01 -#> -4.8160e+00 1.7677e+01 -2.3417e+00 1.8043e+00 -1.5608e+01 -3.7794e+00 -#> 1.0231e+00 9.7939e-01 -4.3045e+00 -5.1458e+00 3.6919e+00 4.1636e+00 -#> 3.0280e+00 6.1041e+00 9.0048e-01 4.3932e+00 -1.2203e+01 2.1182e+01 -#> -2.6326e+00 -5.6641e+00 6.3962e+00 8.7000e-01 1.0775e-01 1.2407e+01 -#> 9.1635e+00 2.8067e+00 2.2553e+00 2.4540e+00 5.8933e-01 1.7804e+01 -#> 3.9390e-01 1.5576e+00 2.1263e+00 3.5894e+00 7.0876e+00 1.6979e+01 -#> 2.2544e-01 4.1435e+00 1.8176e+00 3.9213e+00 -1.5278e+01 2.8989e+00 -#> -4.3914e-01 1.2226e+01 -4.4174e+00 2.6144e-01 -5.7790e+00 1.1095e+01 -#> -1.0954e+00 3.1233e+00 -4.9339e+00 -5.5239e+00 1.2868e+01 -1.3096e+00 -#> -1.7473e+01 -3.2913e+00 -9.9370e+00 1.6746e+01 -1.0182e+00 -1.9162e+01 -#> 6.0977e+00 -7.1840e+00 9.8135e+00 -1.5442e+01 4.1905e+00 4.2416e+00 -#> 8.6695e+00 -1.1826e+00 -1.3659e+01 1.9926e+00 3.4558e+00 -8.9040e+00 -#> -7.1326e+00 3.0948e+00 5.0819e+00 8.8750e-01 8.3385e+00 -1.4594e+00 -#> 1.3645e+01 5.1997e+00 3.6020e+00 -8.2307e+00 2.1137e+00 1.2046e+01 -#> -4.8727e-01 -7.2430e+00 -2.7028e+00 1.1680e+01 -7.5035e+00 -5.5648e+00 -#> 7.9166e+00 -2.5712e+00 1.0126e+01 -5.6492e-01 -1.6257e+01 8.2240e+00 -#> 1.1700e+01 1.9525e+00 -1.7176e+00 7.5046e+00 -9.1546e+00 6.5096e+00 -#> 5.9320e-01 -5.0605e+00 -1.0223e+00 1.9240e+01 -1.0675e+01 2.9595e+00 -#> 1.3353e+01 -1.2302e+01 -2.9855e+00 -1.1250e+00 5.4672e+00 -7.9218e+00 -#> -2.6932e+00 6.7082e+00 -1.1248e+01 5.8509e+00 1.2224e+01 -1.1559e+01 -#> 6.9015e+00 -6.2887e+00 1.2609e+00 3.2606e+00 -1.2129e+01 2.1034e+00 -#> 5.9783e+00 2.8002e+00 -1.0461e+00 2.0890e+00 -1.5872e+00 4.5608e+00 -#> 2.3425e+00 9.2480e+00 -1.0278e+01 8.0220e+00 -4.5289e+00 1.0035e+01 -#> 1.3217e+00 -2.8492e+00 4.2375e+00 -8.9814e+00 -8.6686e+00 -8.7894e+00 -#> -#> Columns 25 to 30 8.3360e+00 -3.7383e+00 -1.0755e+01 -4.4215e+00 -5.5599e+00 4.4972e+00 -#> 9.7614e+00 3.3047e+00 8.1004e+00 -1.7579e-01 -2.6047e+00 6.5980e+00 -#> -4.3140e+00 -9.9903e+00 3.5220e+00 -1.1364e+01 -2.4828e+00 -1.2487e+01 -#> 7.3264e-01 
1.2492e+01 1.8758e-03 -1.5940e+01 -1.7799e+01 7.2498e-01 -#> -6.0802e+00 -2.9302e+00 5.6184e+00 -7.8101e+00 -3.4038e+00 3.7225e-01 -#> -3.1649e+00 -8.9998e+00 2.3312e+00 7.6182e+00 8.3816e+00 9.3550e+00 -#> -1.1552e+01 2.3792e-01 1.4321e+00 3.3447e+00 -1.7611e+01 8.7142e+00 -#> -8.7107e-01 8.1181e+00 -1.4570e+01 6.2605e+00 7.8271e+00 -4.8895e+00 -#> -2.9786e+00 -2.9680e+00 -2.1442e+00 -1.3693e+01 2.4202e+00 -1.5083e+00 -#> -3.8475e+00 -6.6983e-01 1.0256e+00 -1.1677e+01 -4.5688e+00 1.2044e+01 -#> -1.4389e+01 -1.1644e+01 -2.4801e+00 2.1499e+00 -5.2939e-01 3.5510e-01 -#> -2.3637e+00 -3.1054e+01 -6.8584e+00 9.6814e-02 1.5122e+00 9.6570e+00 -#> 4.2619e+00 5.8534e+00 -1.5839e+01 1.4565e+00 -1.7065e+00 -9.5515e+00 -#> -8.0406e+00 5.8786e+00 7.9152e+00 -6.5251e+00 -1.3515e+01 1.7494e+01 -#> -5.5358e+00 -1.6714e+01 -1.0522e+00 5.3414e+00 -1.4394e+01 7.4968e+00 -#> 5.9624e+00 7.6743e-01 4.9624e+00 9.7006e+00 -5.8005e-01 -2.2949e+00 -#> 3.7649e-01 -1.2126e+01 1.4155e+01 8.3035e+00 1.8507e+00 4.5304e+00 -#> 2.6175e+00 1.0547e+01 -5.2089e+00 -1.7335e+00 -1.1900e+01 -3.6790e-01 -#> -2.6824e+00 -1.0599e+01 1.6807e+01 1.3984e+01 1.0876e+01 1.8125e+01 -#> -6.5827e+00 -2.4784e+00 2.5894e-01 -8.1518e+00 -5.2204e+00 7.2081e+00 -#> 9.8621e+00 4.5915e-02 2.3873e-01 1.2638e+01 1.6099e+00 3.0094e+00 -#> 3.0689e+00 5.8365e+00 -1.1364e+01 3.0868e+00 1.1472e+01 -3.2644e-01 -#> 4.4706e-01 6.1634e+00 -8.9733e+00 -1.2721e+01 6.9967e-01 1.8322e+01 -#> -5.6052e+00 6.5106e+00 8.6718e+00 4.4310e+00 -9.5444e-01 -6.2647e-01 -#> 4.7944e+00 -5.0659e+00 5.1348e+00 -5.5745e+00 2.4925e+00 -1.3703e+00 -#> -1.4523e+01 -1.3315e+01 7.2122e-04 -1.1568e+01 -4.8835e+00 4.6232e+00 -#> -3.9652e+00 6.9381e+00 -4.8351e+00 -1.2155e+01 8.0029e+00 -8.4143e+00 -#> 6.9426e+00 3.5703e+00 -9.3369e+00 3.6161e+00 2.8410e+00 4.1312e+00 -#> -6.9064e+00 9.6734e+00 -1.0496e+01 -3.5655e+00 -8.3905e+00 -1.7785e+01 -#> 3.1462e-03 1.9450e+00 -5.5292e+00 2.8397e+00 -6.2833e+00 3.3583e+00 -#> -5.6672e+00 3.4510e+00 -1.5409e+00 8.8068e+00 -5.1308e+00 2.6718e+00 -#> -3.3111e+00 1.4582e+01 5.8425e+00 -1.7964e+01 6.5661e+00 -2.8826e+01 -#> -6.9143e+00 1.0431e+01 -5.4161e+00 -2.9465e+00 3.6363e+00 3.3457e+00 -#> -#> Columns 31 to 36 1.2633e-01 2.0837e+00 -1.7995e+01 -7.4558e+00 -3.4899e+00 -2.1166e+00 -#> 5.1230e+00 9.6372e+00 4.8431e+00 -4.0710e-02 -1.1583e+01 2.3542e+01 -#> 3.2173e+00 6.4572e+00 8.2509e+00 8.3110e+00 -1.1414e+01 -1.0241e+01 -#> 1.9424e+00 -8.8026e+00 1.1571e+01 6.9345e+00 2.2727e+00 1.0056e+00 -#> -7.7589e+00 -1.0472e+01 -3.7097e+00 -1.0164e+01 -3.7912e-01 -2.1021e+01 -#> 1.9517e+01 4.7494e+00 4.4957e+00 -1.0816e+00 5.4489e+00 -6.5487e+00 -#> -8.7826e+00 7.8509e-02 9.0676e+00 6.6272e+00 2.6549e+00 -1.5556e+01 -#> 9.7232e-01 1.4177e+01 7.6577e+00 1.4539e+00 -1.0341e+01 -4.9364e+00 -#> 5.9784e+00 5.7458e+00 -1.4109e+01 -5.6991e+00 9.1061e+00 1.4150e+01 -#> 1.6477e+01 -3.1146e+00 2.9849e+01 -1.8320e-01 -2.9251e+00 4.5001e-01 -#> -5.6236e-03 1.4766e+00 -6.0240e+00 9.4568e+00 -1.5395e+00 -1.5155e+01 -#> 4.9929e+00 6.7127e+00 -8.6386e+00 -4.3836e+00 -5.1465e+00 -2.8905e+00 -#> -3.2138e+00 6.1757e+00 1.0868e+01 2.0150e+01 2.3389e+00 -1.8949e+01 -#> -3.8071e+00 9.3911e+00 1.2564e+01 -9.0786e+00 4.9773e+00 9.1801e+00 -#> 6.5899e+00 -1.0259e+01 2.5031e+00 -6.8856e+00 2.3481e+00 -2.8523e-01 -#> 7.7059e+00 -3.3388e+00 -1.0783e+01 3.0331e+00 1.0327e+01 1.5650e+01 -#> 5.5370e+00 -2.3466e+00 7.0102e+00 3.5636e+00 5.4982e+00 2.2237e-01 -#> -4.8712e+00 -8.5445e-02 -6.6527e+00 1.0074e+01 -9.5377e+00 5.8579e+00 -#> 7.7541e+00 -5.7797e+00 -1.3062e+00 -1.3028e+01 
1.2405e+01 7.4622e+00 -#> 7.9735e+00 1.0001e+01 -2.2889e+00 2.2752e+00 -8.3279e+00 -1.2083e+01 -#> -6.8952e+00 -8.5599e+00 -8.3242e+00 5.5835e+00 2.0972e+00 5.6569e+00 -#> 1.2612e+00 1.4267e+01 1.1097e+00 1.6704e+01 -3.4209e+00 7.5645e+00 -#> 2.4637e+00 4.9550e+00 -2.2320e+00 -6.2892e-01 5.8074e+00 2.5013e+00 -#> -1.9164e+01 -3.9155e+00 -9.6766e+00 -1.0788e+01 1.0344e+00 -4.6280e+00 -#> -5.2133e-01 -5.1429e+00 4.8088e-01 2.4318e+00 4.7498e+00 1.5048e+01 -#> 9.6296e+00 7.1187e+00 7.5106e+00 -1.4648e+01 -6.7761e+00 -9.6407e+00 -#> -6.2040e+00 -5.7022e+00 3.5299e+00 -7.0126e+00 2.2069e+01 -1.1930e+01 -#> 1.0790e+01 1.6324e+01 -2.4648e+00 8.3412e+00 1.5067e+01 2.6021e+00 -#> -5.1471e+00 -6.1570e+00 1.7403e+01 9.0279e+00 -1.1493e+01 -2.4385e-01 -#> -6.8378e+00 -2.1936e+01 -8.3022e+00 -7.4845e+00 9.9043e+00 3.4503e+00 -#> 3.5370e+00 -4.7399e+00 -9.0786e+00 -8.4300e+00 -1.1417e+01 -2.3562e-01 -#> 1.6061e+00 -6.5874e+00 6.4658e+00 -7.2971e+00 1.1293e+01 -1.7372e+01 -#> -6.1479e+00 4.7600e+00 2.8297e+00 -4.8681e+00 -5.9525e+00 -2.5531e+00 -#> -#> Columns 37 to 42 -4.1066e+00 2.3703e+00 -5.9486e-01 -4.4666e-01 1.8248e-01 -8.1558e+00 -#> 8.1020e-01 1.9982e+00 1.4639e+01 4.4896e+00 7.8410e+00 9.7412e+00 -#> -1.4223e+01 -1.3682e+01 -1.7861e+00 -6.5508e+00 8.4454e+00 7.9100e+00 -#> -1.8200e+01 1.3744e+01 -1.3529e+01 1.0235e+01 5.1733e+00 -1.4189e+01 -#> -6.4968e+00 1.8096e+01 -2.5983e+00 3.8399e+00 5.0918e+00 3.8648e+00 -#> 1.0602e+01 1.2529e+01 1.3578e+01 3.7652e+00 -2.2580e+00 -5.7485e+00 -#> -7.6017e+00 -2.7962e+00 7.1716e+00 -1.3455e-01 -1.4039e+00 -4.6689e+00 -#> 3.3376e+00 -8.9963e+00 1.1831e+01 -7.4716e+00 -5.3106e+00 4.3870e+00 -#> 7.4744e+00 -2.4187e+00 -2.0261e+00 -6.3718e+00 -6.1785e+00 -9.6643e+00 -#> -1.2513e+01 1.6945e+01 -3.6329e+00 6.4687e+00 -5.0772e+00 8.4799e+00 -#> 7.7864e+00 -1.5982e+01 9.4983e-01 4.9537e+00 3.2252e+00 7.0543e+00 -#> 1.2666e+00 -4.2042e+00 -1.7502e+00 -2.3269e+00 -1.6726e+00 -9.4753e+00 -#> 6.5305e+00 -8.1240e+00 9.4768e+00 4.4920e+00 -6.1574e+00 -5.3549e-01 -#> 2.2090e+00 4.4112e+00 -7.0229e+00 1.0717e+01 -6.2096e+00 3.4585e+00 -#> -1.1082e+01 6.0274e+00 -2.1499e+01 -3.2891e+00 2.8027e+00 -1.1171e+01 -#> -1.3268e+01 3.0344e+01 6.3035e-01 -4.0915e-01 -1.5139e-01 -1.0354e+01 -#> 6.5644e+00 7.3598e+00 -6.2678e+00 -1.0709e+00 -4.2123e+00 2.6543e+00 -#> 1.8287e+01 -1.2104e+01 3.0959e+00 6.8876e+00 -9.4723e-01 1.1671e+00 -#> -4.3596e-01 -9.0169e+00 1.3417e+01 -4.2153e+00 3.0050e+00 1.0411e-01 -#> -3.4339e+00 -1.8052e+01 -1.4217e+00 -3.8098e+00 7.3061e-01 -1.7107e+01 -#> 1.3988e+01 9.8941e+00 7.7778e+00 -1.4597e-01 -3.0472e+00 -5.8861e+00 -#> -1.7668e+01 2.0067e+00 -1.7894e+01 -7.7294e+00 -7.6481e+00 1.0407e+01 -#> 2.3894e+00 -8.2646e+00 9.7261e-01 -2.3514e+01 -1.4048e+01 -1.6880e+01 -#> -2.4144e+00 2.2946e+00 4.3116e+00 -5.1044e+00 -1.0307e+01 5.4957e+00 -#> -1.0266e+01 -6.9912e+00 2.0645e+01 -1.2542e+01 2.4918e+00 -9.1484e+00 -#> 3.8444e+00 3.6482e+00 6.9221e+00 -6.6864e+00 7.9277e-01 -6.1825e+00 -#> -7.1886e-01 -1.9885e+00 -8.9606e+00 1.2146e+01 -6.4248e-01 -6.7472e+00 -#> -5.4758e+00 -1.1466e+01 -2.0698e+00 1.4037e+01 9.3900e+00 4.5658e+00 -#> 5.3529e+00 -4.9850e+00 4.7814e+00 1.0499e+01 -1.3354e+01 1.0404e+01 -#> -1.2554e+01 -6.0901e-01 -4.0208e+00 -2.2119e+01 5.2214e+00 -1.0635e-01 -#> -4.0157e+00 5.0545e-04 -4.0805e-01 -6.4190e+00 -9.8647e+00 7.2780e+00 -#> -3.5262e+00 1.7072e-01 -1.3999e+01 4.5315e-01 -1.7660e+00 -1.6332e+00 -#> -5.1083e+00 -4.2060e+00 -2.6048e+00 2.4064e+00 3.4505e+00 1.4462e+01 -#> -#> Columns 43 to 48 7.4749e+00 3.2896e+00 -6.5771e-01 
-2.0536e+00 -1.5341e+00 -2.1485e+00 -#> -1.4133e+01 -5.2030e+00 1.2760e+01 -1.2155e+01 1.1957e+01 -8.8063e+00 -#> 5.0999e+00 -6.1652e+00 3.1785e+00 -6.0359e+00 -4.3141e+00 2.7696e+00 -#> 4.1555e+00 -1.0005e+01 7.6561e-01 1.1265e+00 -5.4895e+00 7.6430e+00 -#> 4.9313e+00 -1.8493e+00 5.9208e+00 -9.9984e+00 -4.2710e+00 -1.3521e+00 -#> 5.3910e+00 -4.5412e-01 -9.8172e+00 2.3860e+01 -1.8580e+01 1.1603e+01 -#> 5.4036e+00 3.2782e+00 3.6406e+00 -7.9052e+00 -3.6935e+00 -5.4650e+00 -#> 4.7777e-01 1.8790e+01 -7.1115e+00 -1.1741e+01 5.1975e+00 -9.0889e-01 -#> -4.3182e+00 9.0281e+00 -8.1416e+00 7.9910e+00 -9.8128e+00 -1.0565e+01 -#> 5.0018e-01 -4.6825e+00 5.4171e+00 -1.3931e+01 1.5320e+00 -8.1043e+00 -#> 1.3373e+01 -5.6451e+00 -6.6508e+00 1.5060e+00 -8.8349e-01 1.1201e+00 -#> -5.6232e+00 3.6242e+00 -1.0497e+00 2.5834e+00 -2.0360e+00 -7.3847e+00 -#> 6.6888e+00 2.3927e+00 -3.8077e+00 -2.1639e-02 -5.8558e+00 6.8943e+00 -#> -7.8989e+00 -2.6013e+00 5.0901e+00 2.8595e+00 -5.5578e+00 -2.1903e+00 -#> 9.7989e+00 5.4863e+00 -1.1755e+01 1.3804e+01 6.5772e+00 -2.0106e+00 -#> 4.7886e+00 4.1859e-01 1.0380e+00 1.2049e+01 3.2355e+00 -6.1523e+00 -#> -1.0745e+00 4.7642e+00 2.3202e+00 -7.0219e+00 -1.0187e+00 -7.0501e+00 -#> -5.4901e+00 -3.2077e+00 2.0884e+01 2.2532e+00 3.0159e+00 7.5106e+00 -#> -1.1099e+01 1.3630e+01 -7.1980e+00 -1.1558e+01 1.6250e-01 7.4032e+00 -#> 1.2947e+01 5.4460e+00 -1.5756e+01 9.4801e+00 -1.2139e+00 6.2564e+00 -#> 1.1998e-01 5.7220e+00 1.1314e+01 -3.5625e+00 3.5268e+00 1.1134e+01 -#> 1.3069e+01 1.1677e+00 -8.4681e+00 -1.1823e+01 -1.4532e+01 -5.3117e+00 -#> 3.4807e+00 1.9474e+01 8.1345e-01 -4.3280e+00 3.3045e+00 8.0262e+00 -#> -4.1458e-01 2.9176e+00 3.3451e+00 -2.6143e+00 8.6070e-01 2.2524e-01 -#> 4.1285e+00 9.9700e-01 9.8412e+00 3.2570e+00 -8.5901e-01 4.1803e+00 -#> -1.1954e+01 5.9839e+00 -1.7945e+01 1.3585e+01 -3.0892e+00 6.8838e-01 -#> 2.6369e+00 -5.5210e+00 -1.5965e+01 1.8688e+01 -1.1416e+01 9.1712e+00 -#> 1.5720e+00 -3.2047e+00 3.7386e+00 -1.0319e+01 -6.0105e+00 6.2452e-02 -#> -1.0030e+00 -3.8978e+00 8.4415e+00 -6.1983e+00 9.5013e+00 7.3072e+00 -#> 1.0870e+00 -1.7976e+00 1.0758e+01 6.4537e+00 -3.5953e-01 8.7095e+00 -#> -1.2589e+01 1.6091e+00 4.8148e+00 3.5849e+00 -3.8170e+00 -3.7377e+00 -#> -1.0399e+00 1.7239e+01 3.7850e+00 8.8687e+00 1.6787e+00 1.8082e+01 -#> 9.9256e+00 -5.7379e+00 6.7655e+00 -4.6335e+00 1.2040e+01 4.8663e-01 -#> -#> Columns 49 to 54 2.0588e+00 -1.1653e+01 1.0608e+01 2.7881e+00 5.6513e+00 -5.9351e-01 -#> -1.5856e+01 -1.0242e+01 -7.9768e-01 2.1046e+00 -4.7630e+00 -2.5084e+00 -#> 5.8313e+00 -1.4300e+01 4.8191e+00 -3.9928e+00 3.9323e+00 -1.7959e+00 -#> 7.1854e+00 7.4052e+00 -3.5649e+00 3.0255e+00 -6.5195e+00 4.0929e-02 -#> 7.3524e+00 -4.1666e+00 1.4574e+01 -8.3825e+00 1.2557e+00 2.2931e+00 -#> -1.6590e+01 -4.2074e+00 6.5455e+00 -2.3427e+00 -5.9171e+00 -2.4614e+00 -#> 8.4701e+00 -4.9499e+00 2.8532e+00 1.2714e+00 -1.0051e+01 -2.1670e+00 -#> -4.7380e+00 -4.1595e+00 7.7442e-01 4.5593e+00 -3.1357e+00 1.0338e+00 -#> 1.7061e+01 -1.1289e+01 8.1655e+00 5.5929e+00 5.5192e+00 6.9233e-01 -#> -1.1707e+01 -4.1463e+00 -4.4399e+00 -5.5704e-01 -5.4864e+00 -4.6775e+00 -#> -1.9277e+00 4.0682e+00 1.0254e+01 -2.4243e-01 -1.0002e+00 -1.2724e+00 -#> 6.7865e+00 -1.4280e+00 1.7221e+00 -4.3760e+00 5.2218e+00 -1.1768e-01 -#> -3.3110e+00 -9.4838e-01 -1.9602e+01 9.0612e+00 -1.4708e+00 4.8763e+00 -#> -3.5278e+00 4.1600e+00 -1.1134e+00 7.8087e-01 -2.6953e+00 -1.4792e+00 -#> 6.3919e+00 -8.5068e+00 1.0860e+01 -2.0888e+00 4.6862e+00 -2.6934e+00 -#> -1.3631e+01 3.2825e+00 3.6291e+00 2.1424e-01 -6.6992e+00 
-2.3271e+00 -#> -8.4119e+00 -8.5404e+00 4.4918e+00 2.3728e+00 -2.8177e+00 2.3407e+00 -#> -5.7681e+00 4.3214e+00 -1.0287e+01 3.5729e+00 -4.2341e+00 1.4796e+00 -#> -8.7989e+00 5.0641e+00 -4.5184e-01 -7.1411e+00 4.0992e+00 -2.9186e+00 -#> -5.6270e+00 -6.4904e+00 3.5192e+00 -4.1002e+00 5.7519e+00 -9.8820e-01 -#> -4.6634e+00 -5.7731e-01 -5.6198e+00 -2.6096e+00 5.3468e+00 -2.4138e+00 -#> 1.9903e+01 -1.9132e-01 3.7509e+00 -2.5854e+00 8.5715e-01 -1.6367e+00 -#> -1.8672e+01 -6.1275e+00 6.0109e+00 6.0365e+00 -3.1170e+00 2.3840e-01 -#> -3.7971e-01 -6.9648e+00 2.7984e+00 -2.4435e+00 -4.0945e+00 3.8353e-01 -#> -1.4058e+01 5.9574e+00 -1.9671e+00 -5.1292e+00 1.5424e+00 -5.5177e-01 -#> -8.7823e+00 1.3229e+00 -7.1644e+00 1.2158e+00 -3.8932e+00 3.1650e+00 -#> -5.6919e+00 1.2080e+01 1.2179e+00 4.1779e+00 -5.4655e-02 2.0174e-01 -#> -8.4263e+00 3.1701e+00 2.4079e-01 5.1354e+00 2.8352e+00 -2.6872e-01 -#> 4.9421e+00 -5.0250e+00 2.6910e+00 2.7623e+00 -6.1163e+00 1.4400e+00 -#> -1.1406e+01 2.4924e+01 -9.8252e+00 -1.6379e+00 -2.1415e+00 -1.5942e+00 -#> 5.9189e+00 6.6855e+00 5.4495e+00 8.0702e+00 -6.0936e+00 1.2601e+00 -#> -7.1622e+00 7.4579e+00 -1.6247e+01 -1.7368e+00 3.5875e+00 8.7810e-01 -#> -5.6202e+00 -4.4285e+00 1.1572e+00 -4.1747e+00 -1.2560e+00 -6.9329e+00 -#> -#> (12,.,.) = -#> Columns 1 to 6 2.2343e+00 2.4861e-01 1.3061e+01 -1.3344e+01 1.2442e+00 1.6980e+01 -#> 5.8112e-01 -1.1779e+01 -2.8015e+00 -1.2982e+01 -3.2822e+00 7.1146e-02 -#> -4.3058e+00 1.0108e+01 2.0428e+00 4.8671e+00 -7.7594e+00 -1.3086e+01 -#> -1.4351e+00 -5.7256e+00 -4.4247e+00 1.3467e+00 1.5534e+00 -1.1222e+01 -#> 1.7914e+00 6.6421e+00 -2.5117e+00 2.6416e+00 4.7904e+00 -3.9372e-01 -#> -8.3136e+00 -1.5627e+00 -4.9528e+00 6.6515e+00 5.2495e+00 -1.7103e+00 -#> 5.3299e+00 6.1510e+00 -7.8576e-01 -3.3475e+00 -1.0934e+01 -3.2163e+00 -#> -2.7892e+00 7.9525e+00 1.3028e+01 -3.7640e+00 -2.0863e+01 -2.6415e+00 -#> -4.3488e+00 -8.0201e+00 -5.8985e+00 -3.7523e-01 1.3658e+01 -5.0732e+00 -#> -7.9534e-01 -8.1160e+00 -4.3030e+00 -8.5787e-02 -1.4162e+01 -1.9195e+01 -#> 1.7526e+00 -1.3950e+00 -3.5435e+00 -3.9843e+00 -5.4550e-01 1.4017e+00 -#> -5.9295e+00 -1.8240e+00 2.3624e+00 2.2988e+00 8.0792e+00 5.5243e+00 -#> 3.5374e+00 1.2951e+00 1.0387e+01 -7.4211e+00 -1.6104e+01 -9.4376e+00 -#> 4.2853e+00 -5.5056e+00 -4.3562e-01 1.1677e+00 6.9683e+00 3.3208e+00 -#> 7.1005e-02 8.5128e+00 3.6922e+00 7.9138e+00 5.0410e+00 3.8249e+00 -#> -7.2560e-01 -9.5460e-01 -4.3871e+00 -4.4477e+00 5.1031e+00 5.3250e+00 -#> 9.0075e-01 9.4957e+00 5.7318e+00 -5.9370e-01 7.3959e+00 8.1217e+00 -#> -6.2461e-02 -8.1013e+00 -2.5499e+00 -1.3845e+00 3.1540e+00 1.0728e+01 -#> 5.4301e+00 2.6902e+00 -5.4021e+00 1.0105e+01 -1.1946e+01 -8.0718e+00 -#> 1.5006e-02 5.5758e+00 4.5327e+00 3.3208e+00 3.3094e-01 -2.0927e+01 -#> 7.4586e+00 4.6525e+00 5.8436e-01 1.0973e+01 -2.3610e+00 4.2508e+00 -#> -1.2983e+00 -1.1301e+01 1.2430e+01 4.0250e+00 4.1832e-01 -1.6513e+01 -#> -7.8391e+00 2.4531e+00 -1.8300e+00 1.0450e+01 1.5860e+00 -2.5208e+00 -#> 5.6105e+00 5.5700e+00 -4.3786e+00 -9.1806e+00 -5.7021e+00 -7.4651e+00 -#> -9.2395e+00 6.9638e+00 6.7176e+00 4.8742e+00 1.5557e+00 -4.7856e+00 -#> 1.3244e+00 -1.5630e+00 -6.8633e+00 -2.1048e+00 -4.5452e+00 -1.2455e+01 -#> -3.4415e+00 -3.6602e+00 -8.8038e+00 1.2900e+01 -1.6583e+00 3.5402e-01 -#> -5.4946e-01 4.2501e+00 -3.8394e+00 -1.1170e+00 -8.1395e-01 -3.9050e+00 -#> 2.5022e+00 8.3729e+00 -2.6832e+00 2.7693e-01 3.8717e+00 -6.5552e+00 -#> 7.8093e+00 1.3572e+00 -1.4310e+01 5.0189e+00 3.0458e+00 8.0139e+00 -#> 1.3326e+00 -6.4450e+00 1.0170e+00 5.7269e-01 1.9765e+01 -6.6270e+00 
-#> 1.9113e+00 5.7325e+00 7.8303e+00 -8.4500e-01 -7.9439e+00 -6.7554e+00 -#> 2.8425e+00 8.1149e+00 6.1998e-01 -6.4198e+00 -1.3863e+01 2.7923e+00 -#> -#> Columns 7 to 12 1.0106e+01 -8.0998e+00 -1.6541e+00 -1.0441e+01 -6.2678e+00 1.5084e+01 -#> 4.2660e+00 5.1183e+00 -4.0667e+00 -2.3064e+01 -4.2162e+00 -1.1576e+00 -#> 1.3151e+00 -4.1282e+00 1.2276e+01 2.8656e+00 1.3460e+01 -8.0012e+00 -#> -4.6559e+00 -1.3864e+01 -4.5554e+00 -3.6413e+00 -5.4804e+00 -1.7346e+01 -#> 9.5870e+00 7.6032e+00 8.3130e+00 1.2399e+00 2.7644e+00 3.0823e+00 -#> 7.4200e+00 -1.7400e+01 -1.1742e+01 7.4787e+00 -2.5101e+01 5.8978e+00 -#> -1.8558e+00 2.3167e+00 -5.5227e+00 -3.7291e+00 4.4580e+00 1.3574e+00 -#> 4.9338e+00 1.0376e+01 -5.2100e+00 6.7813e+00 -3.7380e+00 -1.6642e+01 -#> 1.8014e+01 -2.5866e+00 -5.0389e+00 -3.0290e+00 1.5388e+00 -1.0736e+00 -#> -2.3015e+01 -4.0512e+00 -2.0893e+00 -4.2346e+00 -2.2044e+01 1.4695e+00 -#> 4.4195e+00 -1.5963e+00 2.3283e+00 -7.8490e-01 -1.9061e+00 9.3110e-01 -#> -1.0458e+01 2.6943e+00 -1.1518e+01 3.4626e+00 9.9623e-01 1.5833e+01 -#> 5.4454e-01 -1.8906e-01 -4.6633e-01 8.7036e+00 -1.3710e+00 -1.1635e+01 -#> -3.0853e+00 1.0756e+01 -2.1256e+01 -1.3527e+01 2.0147e-01 8.2447e+00 -#> -2.6700e+00 -2.6537e+00 3.8012e-02 -4.0746e-01 4.2514e+00 1.3912e+01 -#> 4.6423e+00 -1.3562e+01 3.3131e-01 -4.1649e+00 -7.3361e+00 4.3607e+00 -#> -1.1995e+01 7.7007e+00 -1.9198e+00 -1.5712e+01 -1.6685e+01 9.2774e+00 -#> -5.1702e+00 1.3754e+01 1.1154e+00 -2.0734e+00 1.8528e-02 -3.4237e-01 -#> -7.5284e+00 -9.7365e+00 -3.6080e+00 6.4859e+00 -8.1583e+00 6.2707e+00 -#> 1.2302e+01 4.8155e+00 6.7645e-01 -9.2286e-01 7.5249e+00 -1.2391e+00 -#> -3.8789e+00 -7.4176e-01 -3.9320e+00 -6.7518e+00 2.9841e+00 2.0185e+01 -#> 3.6928e+00 -1.8126e+01 -9.5705e+00 2.5406e+00 7.8560e+00 -1.4716e+01 -#> 1.0955e+01 3.9481e+00 -1.9212e+01 -2.1933e+01 -1.2778e+01 1.1302e+01 -#> 7.4397e+00 -7.9327e-01 -9.3109e-01 1.9953e+01 -1.2216e+00 1.0269e+01 -#> 1.7136e+00 -1.5817e+00 -5.9169e+00 -6.5075e+00 4.2173e+00 1.1803e+01 -#> 6.3738e-01 4.2163e+00 -1.4027e+00 9.7656e+00 -2.3720e+00 1.1708e+01 -#> 4.1512e+00 -6.7453e+00 -7.7274e+00 3.5501e+00 8.7879e-01 4.4375e+00 -#> -3.6729e+00 -1.0202e+01 -6.3316e-01 -1.5827e+01 -5.6612e+00 -1.3232e+01 -#> 3.4021e+00 1.2152e+01 4.2543e+00 1.6077e+01 7.1638e+00 -9.5322e+00 -#> -1.6027e+01 -1.2267e+01 9.8984e+00 5.8649e+00 1.5113e+01 2.3995e+01 -#> 6.6292e+00 -6.8196e+00 4.2903e+00 7.1128e+00 -1.0787e+01 -1.3526e+00 -#> -7.1549e+00 1.3952e+01 -9.2232e+00 -6.3592e+00 6.6941e-01 -9.2557e+00 -#> -3.7317e+00 9.6946e+00 -6.1778e+00 9.9506e+00 -5.2459e+00 -5.7252e+00 -#> -#> Columns 13 to 18 7.9775e+00 -3.3716e+00 -1.6398e+01 1.0290e+00 1.4219e+01 -7.9817e-02 -#> -2.8099e+00 9.6442e+00 5.9975e+00 -8.3840e+00 -1.2924e+01 1.7883e+01 -#> -2.9478e+00 -4.1491e+00 6.5517e+00 -3.4311e+00 -2.5219e-01 -4.7551e+00 -#> -1.7489e+00 4.8127e+00 1.7067e-01 -1.0031e+01 4.8095e-01 1.7671e+01 -#> 3.3080e+00 -5.0196e-01 1.6195e+01 2.1735e-01 -1.1563e+01 -1.8092e+01 -#> 5.8877e+00 8.8385e+00 4.5792e+00 -1.3592e+01 3.6850e+00 3.2994e+00 -#> -3.6112e+00 6.8737e+00 3.0362e+00 3.9409e+00 1.3417e+01 -7.3520e+00 -#> -1.2533e+01 -5.9631e+00 -6.3714e+00 2.5638e+00 1.9757e+01 -1.3020e+00 -#> 1.8081e+01 -3.9881e+00 -2.4899e+01 -1.1685e+01 -5.0189e+00 2.7581e-01 -#> -1.0677e+01 -1.7292e+00 6.1302e+00 9.3948e+00 -5.9134e+00 1.0920e+01 -#> -1.5548e+01 -9.7880e+00 1.5163e+01 3.2234e+00 -1.4654e+01 -2.7111e+00 -#> 2.6025e+00 1.6772e+01 -9.5342e+00 -1.7929e+01 6.9729e+00 1.7990e+00 -#> -4.6987e+00 1.8096e+00 -4.8640e-01 1.7515e+00 1.6400e+00 8.6115e+00 -#> 
5.8794e+00 2.3319e+01 -1.2410e+01 -1.4471e+01 -2.5027e+00 1.3834e+01 -#> 3.6492e+00 1.0097e+01 6.0911e+00 -1.6952e+01 3.7765e+00 1.2470e+00 -#> 9.0154e+00 -7.9646e+00 1.0940e+01 -2.5419e+00 -1.2839e+01 -3.2985e-01 -#> -1.0842e+01 1.9563e+00 2.8683e+00 6.0006e+00 -4.8396e+00 -3.4927e+00 -#> 5.7079e-01 -9.4869e+00 -7.2187e+00 -2.3264e+00 1.1500e+00 1.3042e+01 -#> 9.3394e-02 -1.3834e+01 -1.0763e+01 -9.9645e+00 1.9327e+01 -3.2349e+00 -#> 1.6126e+01 -7.6733e+00 7.4829e+00 -1.7533e+00 -4.0451e+00 6.5047e+00 -#> -7.6359e+00 -1.2013e+01 -7.0419e+00 7.2818e+00 -9.0434e+00 6.7979e+00 -#> -7.6891e+00 -3.2007e+00 -1.9260e+01 1.1613e+01 1.1378e+01 1.1982e+01 -#> 1.8505e+01 1.5860e+01 1.0207e+01 -1.2126e+01 -3.8591e+00 -5.2286e+00 -#> -1.0642e+01 9.9338e+00 -4.5486e+00 -2.5227e+00 -4.0303e-02 -1.1448e+00 -#> -2.2505e+00 2.7698e+00 1.3077e+01 -9.0763e+00 7.3168e+00 -5.4729e-01 -#> -1.9906e+00 -1.5236e+00 -5.2221e+00 -6.9238e-01 9.1291e+00 2.5066e+00 -#> -3.4526e+00 1.7163e+00 -1.1267e+00 4.2589e+00 -4.5121e-01 1.1336e+00 -#> -8.9524e+00 -9.5551e+00 -4.5162e+00 -2.4994e+00 1.2130e+00 -2.3913e+01 -#> -4.9009e+00 -8.7625e+00 3.8590e+00 -2.1340e+00 -7.0136e+00 -7.2723e+00 -#> -4.2614e+00 -1.5789e+01 1.1984e+01 1.4028e+01 2.2274e+00 -9.6963e+00 -#> 1.0833e+01 1.0684e+01 3.9278e+00 -2.4064e+00 -7.7770e+00 8.4217e+00 -#> 5.8047e+00 1.9016e+01 -1.1636e+01 5.5353e+00 -4.7503e+00 1.2186e+01 -#> -5.2927e+00 1.4010e+00 1.2860e+00 6.6502e+00 -5.9610e+00 -1.4549e+01 -#> -#> Columns 19 to 24 -7.8774e+00 9.3987e+00 -3.1508e+00 3.2035e+00 -1.7237e+00 -2.2337e+00 -#> 9.8753e+00 -5.1873e+00 -1.5979e-01 7.0176e+00 1.4790e+00 -2.2652e+00 -#> 2.4684e+00 -1.5629e+01 -1.0845e+00 2.5782e+00 5.0271e+00 -1.5199e+01 -#> 1.1172e+01 1.2462e+00 1.2406e+01 7.3422e+00 -1.3610e+00 -7.2763e+00 -#> 1.1630e+01 -6.2654e+00 1.3605e+00 3.8456e+00 9.3753e+00 -9.7017e+00 -#> 2.6302e+00 -4.7561e+00 -1.1254e+01 -2.3001e+01 4.0595e-01 6.5323e+00 -#> -1.6812e+01 -1.8110e+01 -3.6096e+00 -3.5465e-02 -1.4968e+01 9.3158e+00 -#> -2.4737e+01 8.3503e+00 -1.7397e+00 2.9828e+00 -8.2400e+00 4.5024e+00 -#> -3.7867e+00 8.1052e+00 -8.3192e-01 8.8176e+00 4.7793e+00 -1.0590e+00 -#> -3.2457e+00 2.4234e+00 -5.5022e+00 -5.7761e-01 5.2855e+00 -7.4730e+00 -#> -1.1944e+01 -6.8048e-01 -8.7536e+00 -1.1724e+00 3.2814e+00 -1.1310e+01 -#> 2.9503e+00 1.9739e+00 -8.4449e+00 -1.2881e+01 -7.4179e+00 -2.1671e+00 -#> -7.6667e+00 9.0520e+00 -4.4530e+00 3.8394e+00 -4.5973e+00 1.5565e+01 -#> 1.2824e+01 -3.3694e+00 1.1752e+01 -2.3582e+00 5.0714e+00 4.2392e+00 -#> 8.0841e+00 -2.5142e+00 -1.0000e+01 1.6647e+00 -8.9167e+00 -1.6243e+01 -#> 1.3602e+01 2.1327e+00 -1.9909e+01 2.4326e+00 1.2900e+01 -3.7971e+00 -#> 1.8914e+01 6.4058e+00 2.3033e-01 -8.2163e+00 3.4782e+00 -1.0740e+00 -#> 6.3810e+00 -4.0515e+00 1.0759e+01 -8.3499e-01 7.3965e-01 -9.1388e+00 -#> -5.8142e+00 4.5707e-02 1.1224e+00 -4.1406e+00 6.5294e+00 -6.1837e+00 -#> -9.5804e+00 -7.5517e+00 -4.6984e+00 1.7209e+00 -6.8727e+00 1.3254e-01 -#> 8.7612e+00 -5.2527e+00 -6.8012e+00 -9.3108e-01 6.9843e+00 1.3341e+01 -#> -1.4546e+01 9.7236e+00 -6.3226e-01 -4.3340e+00 -5.9874e+00 -3.2339e+00 -#> 3.9066e+00 2.2284e-01 -1.4156e+01 -3.8453e+00 -8.6821e+00 -1.9315e+00 -#> -1.5990e+00 7.5505e-01 -1.8868e+00 1.4217e+01 1.5455e+00 -2.3546e+00 -#> -1.0250e+01 -7.7651e+00 -7.1299e+00 -3.6324e+00 3.5405e+00 -4.7819e+00 -#> 1.4602e+01 1.3240e+00 9.4274e+00 -8.6895e+00 -5.1178e+00 2.1310e+00 -#> 8.2942e+00 -2.1216e+01 9.4975e+00 -7.5825e+00 1.0006e+01 -2.7521e+00 -#> 1.1945e+01 -1.4659e+01 6.7279e+00 -6.1521e+00 4.0640e+00 -6.7216e+00 -#> -6.6725e+00 
-#> [tensor print output elided: remaining column blocks through Columns 49 to 54, and slices (13,.,.) through (16,.,.) of the same tensor, each printed as blocks of floating-point values]
10.1875 0.2721 3.2883 1.1273 2.9354 -#> -8.6839 4.6683 4.7676 -4.0084 -3.8531 -2.5571 -#> -2.7880 3.9429 3.5918 2.5719 -6.0650 -0.8717 -#> -10.4243 -2.6173 -0.4466 -6.1775 -4.4496 -1.5418 -#> -13.9890 3.4637 -13.8115 7.9119 2.7131 0.7817 -#> -2.7265 -7.1634 4.9050 0.6368 -4.2845 0.3284 -#> 6.9923 -13.7115 2.1192 -0.0309 3.4392 -0.0764 -#> -2.5682 6.7601 -11.6196 -0.8245 5.3279 2.1712 -#> -1.3624 -9.8982 -3.0243 3.6376 -3.2003 -0.3407 -#> -1.5863 12.3002 -15.6227 9.8221 10.8348 1.3270 -#> -0.2349 16.1233 7.4516 5.6809 -5.4817 -1.3903 -#> 5.5133 7.0013 -9.5816 2.0696 2.0649 -0.7826 -#> 4.8998 -2.3120 -3.3847 -3.6179 -0.9100 -0.9852 -#> -#> (17,.,.) = -#> Columns 1 to 8 6.1443 6.9194 -1.0562 -4.3711 4.4654 0.5805 -9.1360 -14.1596 -#> 2.9841 6.6422 -2.7177 -1.1271 0.7274 -5.4302 2.3300 -5.4333 -#> 1.9327 -5.6299 -6.9753 -3.4866 7.1639 4.7113 -8.2368 9.1127 -#> 4.4792 -8.6570 -7.9390 -12.0866 -1.9326 4.4371 -24.3383 4.0757 -#> -0.5831 -6.2944 -0.5927 -0.3604 11.4115 2.1684 -7.3916 3.0542 -#> 4.4370 -0.6599 3.2872 9.1564 2.6958 6.7784 -1.4288 -2.8477 -#> -2.1455 4.1315 -10.8879 1.0214 7.5736 2.6349 -12.0602 -16.0574 -#> 1.5223 4.4186 -2.2786 -1.9884 0.6818 -11.9779 -7.7711 0.4184 -#> -0.6403 9.8892 -0.0395 -1.9764 -3.8331 12.8968 -8.1312 12.7658 -#> 4.0269 4.3379 -2.1609 -17.5246 8.5467 2.9727 -6.4218 -1.9910 -#> -3.8391 2.7261 -0.3629 0.9506 2.2703 8.2623 13.7309 -7.8259 -#> 2.4645 -2.5920 -12.7258 0.0863 -2.0993 -6.0942 0.4621 13.7128 -#> 2.4760 1.2588 -9.4273 8.8638 4.6503 1.9964 4.9189 -3.0370 -#> -0.5484 10.3669 -6.2906 -7.2585 -16.6554 -2.5892 -7.9406 -4.5755 -#> -0.7397 -8.5692 4.6649 -3.7718 0.4087 4.4101 -11.1031 -4.4849 -#> 0.1002 -2.6376 2.6019 12.8042 -1.8125 2.6265 8.1862 2.4251 -#> -1.0834 -3.4857 -4.6050 9.3387 -5.0314 -9.9726 -9.3422 -8.8175 -#> -1.5390 0.1313 9.5713 -3.0821 -2.9337 -3.8771 3.7128 4.3213 -#> -7.7632 6.8190 6.7207 -13.4798 14.6956 3.4435 -2.8333 -21.4868 -#> 3.2687 -4.6805 -4.7812 -3.1980 1.6897 -13.9128 -9.4960 3.1243 -#> -1.6829 3.7035 14.6709 -2.2077 -3.2603 -1.1370 10.1571 1.2648 -#> -1.0883 11.0762 1.3416 -12.7511 -21.1933 2.9799 1.6736 15.2179 -#> 4.4480 2.8421 -5.4820 -4.7026 -5.0694 -11.4562 -23.3585 7.1575 -#> -2.9082 -1.0902 -8.4298 -1.1375 -2.0630 -7.2376 8.0495 15.6140 -#> 7.1673 -10.4637 -7.5146 0.0550 10.0286 -4.0913 -6.3031 20.5785 -#> -5.4081 -0.1117 2.7705 12.1973 -8.7742 -0.2077 8.2314 11.1816 -#> 1.1246 -5.8824 -3.4459 2.8865 3.2704 4.4080 -1.6483 5.7580 -#> -0.6552 3.7956 -3.3537 -1.6918 3.8882 4.0538 3.1555 -8.2176 -#> -2.3748 -6.2723 5.2245 4.1894 10.9072 -3.6395 -0.2955 -1.0144 -#> -3.5075 -5.5204 -6.1432 0.8693 9.1727 -10.3702 6.5089 3.1676 -#> -2.1095 6.3571 -2.2335 10.9228 -13.9975 -2.3345 22.6371 -9.1804 -#> -0.1568 -0.9048 5.5829 -2.9037 13.8580 -8.1319 -1.9552 3.5291 -#> -3.8153 -0.3087 -7.4113 -8.1043 7.0150 4.7633 16.0928 3.2886 -#> -#> Columns 9 to 16 -3.1810 -3.2627 1.0612 4.0951 1.5249 5.4468 -0.3204 7.8181 -#> -4.6727 -5.7859 0.5801 3.7623 0.3317 -1.4330 1.5571 -0.2559 -#> -7.3271 5.7157 -5.1694 16.8466 -2.7004 2.6971 -3.5376 -10.4199 -#> -6.5211 11.6065 -8.2419 5.3917 -1.8777 -7.2023 5.5177 7.2373 -#> 6.9915 -10.4649 -8.8021 -0.0459 -0.1659 -1.2228 -1.3253 8.1643 -#> 3.4482 6.1002 3.7333 -9.9588 11.2046 11.5108 -6.2545 -9.3857 -#> 8.2876 1.7361 -12.5369 2.4079 16.4401 5.7171 2.8861 13.5347 -#> 3.4647 -11.3662 -21.3331 7.0156 6.9562 6.3768 -10.0717 0.1385 -#> 15.1363 16.1886 -0.6350 11.5681 -6.6108 -8.0296 2.9866 -5.0846 -#> 3.5970 18.8378 -11.3572 -4.2139 0.3005 7.6319 18.2322 -4.4625 -#> -0.8092 -5.0381 4.6445 1.8130 -9.0166 
8.6400 7.3903 -17.6204 -#> 14.0589 -4.6007 10.7953 -7.4705 -12.3759 9.3497 -3.1502 -5.5020 -#> -6.2404 -2.1810 -2.7716 0.7559 2.4455 11.5034 4.8724 -9.3390 -#> 1.5732 -4.2609 14.5878 -21.0616 2.1890 -5.8857 -4.9004 8.9875 -#> 1.0933 -3.6500 3.7066 5.4885 -13.8248 -0.8643 -16.2240 7.1034 -#> -17.7197 5.6153 12.2955 10.4517 -3.1934 4.9312 5.7033 13.4370 -#> 1.3269 -10.6556 -6.0935 6.4030 2.2019 -2.4704 -3.2745 -7.5166 -#> -6.9070 -1.5901 -3.8122 -5.1475 -3.1792 1.8839 -6.9131 7.7553 -#> 9.1986 -4.2436 -2.6430 4.5851 -18.2240 -13.6614 2.8320 -6.5608 -#> -3.7126 -0.7393 12.5292 5.0482 -8.4647 -1.1843 15.3833 -12.8543 -#> -13.4071 -6.8954 -6.4824 -6.8973 -8.1641 7.5570 20.1569 13.7861 -#> 7.7510 10.6749 -6.2694 0.7353 1.9931 3.6837 -7.3125 1.7338 -#> -0.2942 6.2285 5.4998 -8.2894 14.3482 12.4478 -3.6528 -4.6218 -#> 13.3558 -4.2591 18.4636 -8.0938 5.3546 8.5646 -4.5114 3.6857 -#> -1.7329 -15.8077 12.7673 -10.2201 9.5951 8.8423 -11.7141 -9.4775 -#> 19.4649 -6.8292 9.3490 -7.6591 8.2599 -1.1958 -1.2143 9.2001 -#> -9.5506 -3.6073 14.2369 -3.8460 -1.9205 5.9566 -4.4946 -9.4718 -#> 2.9816 -0.1607 -3.5225 14.3550 -7.2591 -11.0014 -2.8438 3.6171 -#> 3.2333 -0.8371 2.9425 2.9499 2.0615 9.2706 -8.6544 -7.0034 -#> -16.4620 2.1289 0.1354 -8.4689 -1.2499 4.8122 2.5817 13.4833 -#> -0.5567 1.8226 -2.1932 -11.4949 -8.9891 7.0014 11.6330 0.0430 -#> -15.2057 -1.4595 -11.9441 16.5283 16.2629 -5.4902 20.6133 -0.9546 -#> -7.3588 -4.2087 -0.1877 4.6742 10.1569 4.0011 -0.8341 8.5255 -#> -#> Columns 17 to 24 4.7876 17.2720 3.3702 -1.0435 -10.8395 -8.1406 -7.5049 -5.5539 -#> -9.9691 -1.1988 -0.3827 8.6837 -16.2213 -13.0124 -2.5975 -7.0430 -#> 4.5661 17.4118 7.9166 2.5864 -10.3547 -1.7683 9.1435 10.5543 -#> -23.2334 10.3631 -25.5210 -12.3129 -4.8079 9.2390 -4.7016 -1.0820 -#> -2.2913 -9.7322 -9.6260 5.5965 -2.5656 -0.5326 -6.9418 -10.7495 -#> -3.2222 8.0219 5.9834 -14.8087 -6.2356 -2.7075 -13.7779 -38.2064 -#> 1.0871 0.2784 4.6760 -6.9322 15.0898 -14.0151 -7.9154 -12.2438 -#> 20.5265 -13.0739 -4.6356 -14.1243 1.4672 -5.4715 -0.2433 19.5680 -#> -12.5312 -2.4462 12.4738 -14.6028 17.5852 3.3055 0.7279 2.9435 -#> -9.0759 7.4424 -13.4984 -3.4864 10.9933 23.4347 -12.4106 -6.8116 -#> 0.2945 -2.3298 -7.9317 14.3155 -0.0490 -8.9765 -2.0218 7.3739 -#> -13.2542 5.0205 -2.1731 1.3804 1.0738 -4.6849 -13.5458 2.3084 -#> 7.1460 -7.8515 2.2562 -6.2295 11.3502 -1.0303 -5.2992 15.8323 -#> -15.0881 0.1767 6.4603 2.2481 10.8668 -1.9336 -3.0059 -3.9025 -#> -3.4863 0.9929 17.7379 -0.4459 4.1941 -0.3456 7.3584 -9.6679 -#> -7.0207 0.9989 -17.1677 12.6743 -3.4119 5.4517 4.1034 -7.5619 -#> -6.6729 -7.0634 -8.5295 -6.7227 7.5573 -20.6982 -17.9593 -6.2486 -#> -0.9564 16.8376 -17.7000 8.7197 -8.0258 3.5434 3.8272 8.6623 -#> 4.7000 -2.5202 -4.5670 -4.7674 -4.1558 19.1030 0.3822 -3.8329 -#> -4.2104 18.9013 -0.6283 -5.7242 -0.6763 4.5063 4.3415 -4.9077 -#> -6.0676 -5.2026 -9.6267 16.3435 -8.3280 16.4242 10.7939 5.0139 -#> -6.8391 1.9071 -11.6442 21.1259 -11.9573 -2.9056 -0.0309 -0.9524 -#> -15.3838 -14.4158 -5.6303 10.7407 18.0120 -5.8079 3.7584 -4.3958 -#> 12.5217 -0.7652 -7.5056 1.5211 -10.9556 9.6789 1.6204 -0.5635 -#> 8.5732 20.9318 -13.5805 8.2473 -2.1256 13.2180 -0.0080 2.3703 -#> 12.6771 -0.0184 15.6403 1.7141 19.6619 -13.3052 17.5606 -7.7566 -#> 2.2863 0.1965 7.5713 -2.2330 0.8285 -10.7955 3.7508 -3.1906 -#> -12.9624 -5.9707 -4.7057 -9.0526 -14.0223 20.6805 0.6815 6.7069 -#> 11.4943 -6.2175 7.6624 8.1907 3.7794 3.9137 16.4800 9.7840 -#> -6.1348 4.4264 4.3409 11.0234 -1.3618 5.7269 -19.5008 -0.8807 -#> 3.5661 -2.3396 -6.4453 4.2679 7.8474 
10.3957 -0.1799 1.4265 -#> 3.1961 -9.3156 25.0814 8.3016 9.0736 -3.5149 -4.4687 4.5316 -#> -2.1196 1.2412 -5.3532 -1.9141 1.0356 2.0371 3.8051 3.0251 -#> -#> Columns 25 to 32 -12.7784 -3.1151 -4.1067 -17.6687 -13.4600 6.3249 -9.3811 -7.3388 -#> -0.7619 -11.6666 12.7568 -7.8023 -13.0570 0.8027 15.5756 -0.4485 -#> 0.7670 -3.6165 1.0185 6.6977 0.9419 0.6155 -6.7409 9.6993 -#> 5.1670 0.1816 -5.8841 10.4241 8.7812 -0.6064 17.1802 5.0143 -#> -4.9780 -2.7439 -14.4820 -3.7913 -4.8708 -2.0714 -3.2589 12.7400 -#> -11.4598 -1.2145 -10.0119 -10.2576 -8.2680 -11.5806 -3.2187 -2.6170 -#> -2.6593 3.2811 -11.8429 6.7609 -2.9523 -20.6242 1.8380 5.0370 -#> -0.6210 -23.1475 -2.7063 -5.0878 3.5943 -1.2283 -3.5178 -7.9973 -#> 3.6137 9.2104 9.5769 -2.2702 11.1843 -0.4034 1.9130 11.4164 -#> 7.6460 -0.3827 -20.7806 -0.2912 -2.5550 2.5483 -0.9074 3.8046 -#> -10.1698 -1.4984 -7.6352 1.1972 3.4435 4.5729 2.1873 2.6382 -#> 2.0410 -4.2316 1.8286 4.2103 -2.9651 6.1363 9.6191 -2.9329 -#> -2.6176 1.7511 -8.1758 -4.8099 -9.2014 3.0600 4.1247 -6.5042 -#> -0.7695 -2.7357 6.5961 0.2550 -8.4879 5.1099 7.8306 -3.8784 -#> 7.1554 -8.0673 -4.7431 4.5292 12.5880 0.7548 17.6900 -0.6163 -#> -9.0725 9.8556 5.5272 -7.2525 -4.6794 -9.3108 12.0713 6.8277 -#> -15.2457 -11.3464 -0.8299 -5.3889 -7.3931 -14.3264 -2.2502 4.6353 -#> 10.7940 4.0557 14.0390 -9.8524 2.6743 -4.7332 5.4225 -1.7788 -#> 5.1810 -9.9234 5.5246 1.2374 7.2331 -0.2130 10.8272 -0.3448 -#> 4.7325 4.7821 -2.9912 4.3711 -2.0756 12.5127 -0.9461 -8.1639 -#> 8.8977 8.8578 -9.6155 -7.2892 -4.1856 4.3503 5.4361 -3.7193 -#> -3.5420 -11.7217 5.6519 -1.5706 1.9119 4.2155 4.5332 -15.1637 -#> 9.5145 6.2907 -9.3909 7.1549 4.9775 3.4711 7.4276 7.5582 -#> -4.8574 7.1038 5.1234 8.3762 -4.2527 -6.3355 6.4191 1.9055 -#> 10.1519 12.8104 4.5860 0.7293 0.0193 -1.5512 -3.6323 -3.6445 -#> 9.6038 3.8469 3.8026 -2.4492 -5.6320 -7.9020 -0.5373 -3.5500 -#> -3.2464 6.0656 -0.9350 3.5552 3.2878 8.9093 4.0319 -3.2805 -#> -0.9716 -4.9932 1.2699 -2.3636 9.6565 2.8870 -2.7157 6.6711 -#> -4.0200 2.8403 11.0463 5.4527 13.8217 -11.7949 -6.8832 -15.9980 -#> 20.2676 10.8881 -7.9728 -13.7948 4.5471 -4.2364 4.6383 12.1830 -#> -13.1566 -7.2088 3.0935 -5.6241 -1.0032 -1.1888 5.7101 9.9611 -#> -16.2755 11.5573 6.1171 -1.1188 -9.2500 7.4123 6.8200 -2.3877 -#> 0.3761 1.2744 1.5478 4.8301 3.0607 -1.7451 -8.4018 5.3828 -#> -#> Columns 33 to 40 -2.6798 10.7408 5.4904 -4.9847 -6.8072 5.9453 -7.2387 9.5456 -#> -1.6163 -9.3078 7.8399 -18.4102 7.3172 -3.7820 -10.7607 6.2403 -#> -10.8727 3.4745 -0.8042 18.6595 2.5602 7.4274 -2.0725 -3.4180 -#> -4.3281 11.4767 6.3640 10.9480 -6.1661 -18.5870 -4.1403 10.0234 -#> -2.4974 1.1080 -6.0992 10.2662 -2.3050 13.6306 16.0703 9.8415 -#> -3.2326 1.2055 -2.1907 -1.9525 -3.4973 8.5366 -14.1278 -2.0794 -#> 1.5418 12.4918 11.6216 2.1373 6.5673 4.2851 -1.3019 15.2588 -#> 6.4475 4.5054 -8.5098 -8.8409 -0.7175 -3.4654 10.5672 2.5838 -#> 5.9579 0.1376 4.9462 1.3211 9.5206 -8.4694 0.2855 -4.4919 -#> 5.8723 8.9215 9.1594 -7.8355 5.6397 2.0405 -0.8956 -3.4817 -#> -0.3586 0.7275 -0.6278 6.3590 13.4949 3.0733 3.7876 0.0623 -#> 12.0649 0.3028 0.3616 -4.0420 10.4038 -0.6742 2.6561 23.8164 -#> -9.3511 -1.2811 1.8103 -9.9004 8.7360 -3.9421 -0.2291 15.3043 -#> -2.3802 -7.0159 11.3137 -13.9367 6.5566 -6.2087 -11.2825 7.4017 -#> 5.6427 -5.7506 -0.9135 1.0892 -2.6045 -13.5775 4.7687 0.4022 -#> -10.2733 -16.6287 -0.3390 -2.6910 0.1103 0.9711 2.0200 -10.0394 -#> 0.4027 -8.1790 -1.2274 -1.8224 6.4739 0.3178 -10.8276 5.8462 -#> 5.6329 0.4834 1.0717 2.2724 -4.3132 -6.7506 1.2790 9.4726 -#> -6.6045 10.1189 6.9280 
1.8852 -2.8820 -9.5778 -18.9174 -7.5127 -#> -4.6235 11.4260 -15.3983 4.8237 0.5430 3.1613 17.0547 -7.7714 -#> 0.9165 -15.7357 -2.9726 -16.5025 -4.2553 8.5064 4.5877 6.8933 -#> -1.0045 -1.5503 11.9903 -16.7153 1.7860 -8.8464 -1.3527 -5.7166 -#> 1.9460 4.4425 -13.1039 0.3578 8.0725 8.0274 9.0462 -14.6979 -#> -1.4275 7.5097 6.2892 0.8070 -3.7205 0.7155 -12.4836 7.3437 -#> -3.1733 9.4923 -12.6845 17.5650 -1.7871 8.4914 -5.2970 -6.5126 -#> 5.3399 6.7639 7.6913 -3.0564 8.4512 2.6952 8.8310 -0.6163 -#> -4.9734 1.6869 3.8372 11.4282 -10.7808 -0.1684 8.9313 -4.8800 -#> -8.7450 -6.8279 -6.4785 21.0190 -10.2943 21.6078 -29.3649 4.0950 -#> -2.7746 -2.4019 1.7970 8.5955 -9.3389 -1.3879 13.5859 -5.6466 -#> 11.1821 -5.2228 13.2041 1.8326 -6.4913 5.4415 -16.9786 -5.4153 -#> -2.8319 -4.7344 9.0530 -7.0999 4.7814 -6.2008 -4.4767 28.7513 -#> -19.1089 -8.6855 13.1355 -9.2310 6.0743 -9.0933 11.9112 -10.9186 -#> -9.4442 0.6115 -4.7476 4.9595 2.3222 31.3597 -0.4587 -3.3852 -#> -#> Columns 41 to 48 13.2335 18.4072 3.8452 -6.9916 16.0191 -1.4387 7.3699 14.8535 -#> -4.2993 10.6820 7.0353 1.3130 -8.7711 10.1553 4.1372 -0.5599 -#> 3.2517 12.1267 -15.7250 0.9289 -12.1262 -4.4340 -5.4977 10.2717 -#> 2.3578 -1.1384 -13.6929 1.9523 -9.0392 16.9055 5.2991 -21.7345 -#> 20.1297 9.7540 -4.4430 17.1155 -10.7662 3.5038 7.8592 3.0781 -#> 8.2474 -7.9042 2.5285 4.0943 11.3423 -2.9282 -3.7371 -5.7120 -#> 4.6633 -6.2643 -2.6094 -16.2379 1.1435 4.4389 17.2909 8.6930 -#> -0.7847 -14.8175 14.3593 -0.9707 13.6857 8.9951 -1.7974 5.3307 -#> -2.7323 13.8403 -4.1151 -8.8232 -5.2754 3.0921 -12.9723 -7.8679 -#> 12.0113 -14.4334 -7.5176 -4.4632 3.3497 4.9875 -13.5565 -1.4509 -#> 2.6535 2.2753 3.7532 1.5156 2.9532 -11.4222 15.0870 12.4004 -#> -0.8589 2.4802 -6.1565 -4.2074 -2.5356 -13.0278 -3.9227 -18.4011 -#> 6.0597 -10.3485 1.7872 -6.2897 0.9800 3.4572 17.1559 -0.8167 -#> -3.0977 -4.0392 5.5219 -7.0214 -15.3351 -7.9001 5.0116 -29.1513 -#> 24.2113 14.1920 3.8048 5.9459 -8.5433 -11.8677 -0.5119 5.3183 -#> -1.5524 -1.9231 -10.8412 14.6299 6.4453 -4.2846 -3.5581 6.6648 -#> -17.1757 -4.2006 -2.9053 -7.5145 11.6744 -18.1325 5.4189 14.1761 -#> -13.5461 6.5545 18.4231 9.6028 6.9905 7.2333 -4.9760 -7.3707 -#> 4.2127 -3.3927 8.6950 -22.2378 -4.6125 -7.6642 4.6457 22.9384 -#> 1.9780 -16.4488 -4.6691 -0.0215 3.3830 1.5566 -12.8181 -11.4244 -#> 4.7052 4.0998 3.6235 -2.1365 11.1460 -6.0001 -11.2796 -2.0067 -#> -9.5723 6.5852 -8.2466 3.4642 -19.0371 -1.4680 -10.6882 -6.0754 -#> 1.5706 -2.1467 -8.1451 4.2954 -5.1098 -9.6729 -8.2277 -16.7589 -#> 4.9486 -7.8326 -9.1857 4.9922 -6.8312 2.1194 13.4255 -7.9282 -#> -2.9973 4.8711 -14.2213 12.5082 -8.9272 -13.5373 -10.6331 -1.7991 -#> -20.9875 -15.0079 -0.1355 -4.0733 3.4722 -1.1602 -1.9319 -10.5716 -#> 10.5513 -6.3437 -10.0204 -5.1811 -7.3454 3.2951 13.2194 -10.6875 -#> -9.0411 9.6437 -0.7267 -8.0317 -3.7791 -14.4714 3.3345 0.9388 -#> -12.0010 -11.0004 -18.0084 11.7813 -4.8742 17.4474 13.8801 2.0438 -#> 31.7265 -13.4833 13.5332 -2.5016 -7.9255 4.0851 1.6553 21.2569 -#> -1.0274 -0.3764 3.9221 0.6137 4.0974 -6.4963 -6.7259 -11.1689 -#> 7.8031 0.8675 5.7429 2.3580 -2.4164 2.3665 5.2437 22.7314 -#> -4.2878 -10.5383 -9.9768 1.3139 -9.4152 -6.3573 9.6883 5.9264 -#> -#> Columns 49 to 54 -0.2671 9.7701 6.0322 4.6811 2.9762 2.2722 -#> -14.4196 -1.0555 10.7776 -8.0051 1.9966 -1.7877 -#> 3.2285 -12.2791 6.1325 -8.0227 6.4722 2.3322 -#> -9.7100 -12.2404 12.3004 -3.0280 -5.8205 -3.4416 -#> 1.8904 1.2133 7.7791 2.6255 1.2190 0.2403 -#> 13.7775 11.0335 -12.7964 -3.6531 -5.7224 -0.2613 -#> 2.1140 0.6198 8.4620 2.8346 -13.2978 
-4.7245 -#> 6.4016 6.4218 5.0143 -7.5094 -10.5561 -1.5737 -#> 5.6732 -21.9615 9.2310 6.7673 -3.2778 0.9200 -#> 10.3036 -3.3432 -9.5236 6.6128 -3.7361 -1.9285 -#> -3.4186 -3.4549 -3.8103 2.1477 0.9592 -0.1993 -#> -4.8919 10.2478 8.8289 1.0471 5.5893 -2.8752 -#> 1.7449 1.9779 -4.1833 2.3643 -0.7714 -1.5880 -#> -7.5838 -4.1429 -6.4595 16.2080 -8.8949 -4.2480 -#> -7.4641 -0.9506 -3.1015 8.1307 11.6222 0.3097 -#> 13.7682 8.3304 1.3823 -18.3744 4.5722 3.6273 -#> -11.2872 3.8188 0.5564 -11.4159 -6.8905 -0.0975 -#> 11.2198 -1.0655 6.9872 -7.0629 -3.6281 -0.3584 -#> -10.5131 -2.6208 -5.8299 0.5071 9.3178 3.8768 -#> 3.3297 8.2219 -13.3294 -0.0814 1.1981 -1.7593 -#> 7.4771 22.3231 3.6667 -4.9007 1.9783 -1.0990 -#> 11.3080 -15.7802 1.4606 -12.4734 2.3892 1.7499 -#> 0.4649 -9.0525 7.7570 -7.6014 -11.8574 0.6533 -#> 5.2104 1.1293 6.9246 6.9781 -3.4111 0.2557 -#> 6.8321 -5.3332 0.8903 -1.9140 -0.1111 0.7168 -#> 11.2020 13.3858 -6.2562 15.5736 -7.2203 -3.6565 -#> -2.5946 10.7949 -9.5117 13.1883 0.9728 2.0467 -#> -16.0799 -13.1516 0.1700 -12.3154 1.6330 4.2724 -#> 1.1213 8.2665 14.9941 -4.0737 -4.9093 1.5110 -#> 4.0347 -4.5001 -0.1521 7.3414 6.3084 -0.2847 -#> 3.4297 8.9231 13.8903 -1.4625 -3.8002 0.3066 -#> -15.8880 12.6141 4.3672 -4.6317 -0.2330 2.2429 -#> 10.0463 3.7442 -3.5000 0.3014 -3.8686 -2.9811 -#> -#> (18,.,.) = -#> Columns 1 to 8 1.6420 1.2879 5.3164 -7.8571 12.7048 5.8433 11.3036 3.7826 -#> -2.5109 0.0431 -1.9696 4.5404 -18.1270 -4.8284 9.7173 6.8896 -#> 3.3734 -1.0133 -1.5317 13.5069 8.2392 -2.2201 18.2343 8.3940 -#> -0.6494 1.1275 -0.7992 6.7130 6.8398 -0.1860 3.1122 -4.9259 -#> -0.1896 1.4126 -7.5902 8.6638 -6.4634 2.9422 6.8588 -8.1542 -#> -2.5423 1.3982 8.7002 -5.7975 -3.0473 -11.9556 13.9886 8.3534 -#> 0.0925 -1.9056 4.1929 9.6189 3.0107 18.1900 -4.0006 -0.3840 -#> -1.9065 -12.0112 0.1455 -5.1230 -5.6154 -0.0627 -6.9732 7.9400 -#> 3.1135 9.0919 -4.3371 -19.9986 6.5665 -11.0074 -8.8907 -3.8055 -#> 7.1416 3.7126 -1.0907 10.3849 11.9761 -1.6818 9.6157 1.0954 -#> 1.6181 -7.9607 -6.3046 11.3903 -19.3071 0.1217 6.8742 7.1179 -#> 3.1030 0.0477 -3.0999 -4.6140 -9.5560 -5.0644 -0.4219 9.1221 -#> -0.3628 -5.0311 7.3550 5.5133 -4.0872 -4.3923 8.5265 11.5909 -#> -2.2950 5.0372 6.1929 0.2985 -9.3952 2.1639 -2.5889 0.1801 -#> 0.5811 -7.8022 -0.1743 -1.9489 -9.0799 -2.7091 12.2969 -0.8607 -#> 2.0249 1.1647 -2.1831 13.7438 -3.9637 -1.8503 8.7329 -7.0915 -#> -3.8079 -6.8342 -6.9091 -2.8699 -1.6679 -16.3355 2.1067 -1.9638 -#> -8.4592 8.9154 0.8565 0.4704 -4.0822 7.5768 -11.9635 12.3542 -#> 3.6177 3.4600 -10.9865 1.7910 -2.5461 10.9428 -11.2756 -9.3878 -#> 2.6046 -3.8795 -1.0403 8.5378 3.9353 3.5416 -1.2939 14.9872 -#> 2.5174 1.1112 -1.4658 5.9015 6.6519 -5.6944 -8.8193 -4.0332 -#> -1.5362 0.8535 -5.1453 -6.1017 -5.8288 -13.3209 -13.5367 -0.0891 -#> 5.5770 -5.0353 -3.9072 9.3719 -15.7994 -14.6314 3.9531 14.4180 -#> 4.4054 -8.4694 10.0511 8.5610 -10.1093 4.8425 1.0703 -1.3042 -#> 6.2056 -1.5487 -4.2585 6.9884 1.3044 -2.5759 5.4127 9.3555 -#> 2.2354 6.1777 8.6935 0.8330 7.0591 9.0817 4.4968 -2.9316 -#> -0.3657 -2.1088 1.9124 -2.1532 0.5189 -4.1221 10.8716 4.1775 -#> -0.4629 -0.4061 -5.8569 -5.7979 8.3717 -0.3196 -4.8337 -5.8717 -#> -6.2881 -4.1826 0.7540 6.1380 -4.3152 -7.9107 6.6223 2.5199 -#> 8.2005 6.7751 -7.6609 12.9753 -3.5936 10.3625 -2.7662 0.3874 -#> 2.6145 -1.6590 1.1084 6.4919 -10.4815 -7.4845 5.0912 -3.3561 -#> -5.6482 6.7101 0.6589 -2.6008 9.2927 -2.4136 25.8413 4.2583 -#> -1.5522 -0.6731 3.7094 13.6084 -3.4717 14.0455 18.0751 9.0139 -#> -#> Columns 9 to 16 5.3804 -12.1351 -10.3660 0.5842 8.7693 
14.6643 1.6818 -4.7866 -#> -6.3455 3.6485 -5.2393 8.7314 -12.4487 4.7045 7.5412 -7.6969 -#> 8.4230 -14.6234 11.1057 -14.9826 4.4700 -5.2711 19.1685 -4.3796 -#> 17.0186 -6.2625 7.5988 -5.7124 6.4860 5.1624 2.9079 -1.0623 -#> 5.3565 -3.7169 29.1445 0.5628 -6.2952 -15.5886 -2.4912 1.7751 -#> 4.5112 -12.6647 -5.2051 -5.0987 -2.6762 -12.0308 -17.8213 2.3569 -#> 9.6305 7.1402 -9.4125 4.1921 5.6178 -4.1310 -7.4195 20.3406 -#> -23.4526 6.1727 4.8096 6.1797 -0.8832 13.5739 10.5601 1.5159 -#> 11.5793 3.3354 -4.7421 -2.3175 7.3162 -5.4094 5.8834 10.9601 -#> 21.4171 3.9900 5.2486 17.4438 4.1869 -10.7578 9.4085 15.2720 -#> 3.6663 10.5914 9.0197 -12.5738 3.7574 -8.8465 -1.0220 3.9162 -#> 8.9307 -4.4353 2.4044 3.1607 -5.1845 -4.3000 -9.6301 -17.7922 -#> 4.1468 -5.9985 -0.0510 -0.9246 8.0400 -0.8469 -19.3457 -0.0925 -#> 2.5744 -1.3962 -4.2692 6.4907 -18.3116 3.7004 -19.8901 2.7946 -#> -0.8892 -3.9821 -2.5982 1.1824 -8.8158 -0.4670 -10.6899 7.9036 -#> -11.7663 -8.2280 10.0735 12.8687 -3.1658 -1.1699 5.9914 4.3675 -#> -2.5636 -13.9672 -10.1241 6.0991 -17.4340 -10.4532 -0.0374 6.1176 -#> -7.6453 9.5692 -7.7252 -12.5117 6.0954 -0.0058 2.8967 -21.3386 -#> -13.3735 5.3355 -13.9365 10.3804 8.2995 -10.9733 33.1979 27.0988 -#> -12.7816 -11.8679 11.4570 -3.9373 5.9092 7.1102 -6.8729 -6.7231 -#> -6.3768 3.6131 2.9764 -1.4754 15.5975 8.9042 16.6951 13.3923 -#> -0.5956 8.4284 9.3541 6.6690 3.1478 15.4808 10.6831 -8.6146 -#> -2.4373 -18.7960 18.9351 15.9599 11.4891 -7.5023 -15.6213 -6.1754 -#> 3.8959 -1.9634 2.8737 4.6599 4.4457 -1.1856 4.8629 -12.7711 -#> 6.0529 -14.6800 2.7329 -1.2511 -7.1787 -5.7072 6.1981 -25.1700 -#> 4.8224 1.9774 8.1258 1.0748 -27.5141 -2.1715 -2.2420 0.7899 -#> 11.7260 -18.2582 6.0380 -6.0932 -3.7677 3.0523 4.9454 -11.8659 -#> -0.3142 -1.3074 -2.0408 -8.9408 15.0563 -0.6220 19.6902 1.4908 -#> -6.0120 9.0143 -10.1494 -11.0386 -5.0090 -5.1131 -9.2287 -8.1093 -#> 14.9981 18.3641 -4.4207 -3.9998 -10.9331 -15.5465 0.6011 -2.4611 -#> 5.7311 2.3118 7.1591 16.2519 -0.1902 6.0938 -4.1115 0.3160 -#> 16.1710 -7.7254 5.3234 -6.6723 0.1106 3.8326 3.5976 -9.7607 -#> -1.7634 -1.1571 2.4107 -5.2687 10.5207 -9.7934 -5.3621 -4.9413 -#> -#> Columns 17 to 24 12.7919 13.8965 7.2914 7.1090 2.6619 -11.1852 20.6129 -0.3397 -#> -6.1425 4.7137 0.3191 12.4660 -12.7929 -10.1171 1.9036 -9.9026 -#> -2.9329 5.4450 8.1227 13.2214 2.8128 -23.6955 13.6789 -10.9361 -#> -21.1594 3.2425 14.4772 -10.3611 21.0172 -5.6483 9.4607 6.3364 -#> -8.1113 2.5801 -8.5287 -0.6597 -14.0022 -0.5884 3.7643 -12.5078 -#> 15.2535 -7.8894 21.2285 -13.4478 -13.0378 -9.4753 6.7139 -6.2283 -#> -18.2535 2.6662 2.6795 -2.7393 4.5147 2.3993 -10.3161 9.5086 -#> 9.5280 -14.5483 4.6656 3.5450 -12.7685 5.6188 -4.8503 11.8668 -#> -8.0276 17.9217 -7.9395 -7.7207 20.0763 -10.4341 0.3116 6.6646 -#> -12.0671 3.4923 3.1873 -8.6960 13.8439 -15.9396 -5.4830 9.7459 -#> 6.6190 -1.9811 -8.0694 9.4840 -16.2822 2.9730 1.0627 3.1966 -#> 20.4468 -0.6983 4.6333 5.5524 -14.3997 -1.8202 -4.0220 -12.3155 -#> 11.8901 -5.3156 4.3961 -4.3514 -17.1259 10.1295 -3.1806 -3.7633 -#> -22.2255 5.1331 19.4986 9.8792 1.8949 14.8938 -12.4817 3.8372 -#> -2.5785 -4.3861 9.4344 -4.7644 2.7722 -10.0445 -4.9544 2.5280 -#> 10.1179 4.8086 -4.7714 6.3682 -4.8846 -8.6565 14.4215 -2.6060 -#> 2.7266 -6.6072 7.6766 8.2032 -2.2043 1.3790 2.5085 -2.5679 -#> -9.4860 13.6252 0.5499 4.3637 -0.8586 3.5431 2.8470 5.7581 -#> 8.7090 8.6967 -25.1840 -12.8700 -0.1018 8.5919 1.0733 10.8271 -#> -6.4616 3.3153 5.3792 7.9816 -1.5986 -3.9117 -6.9535 3.2914 -#> 19.0527 -3.1730 3.6325 3.6042 9.1775 6.4150 2.8763 3.6961 
-#> 7.6531 1.3698 -9.0558 13.6710 -0.5876 -10.8151 11.8866 -19.2419 -#> -8.6815 -17.6992 12.5567 12.1497 6.0738 3.7060 -3.1910 -0.9991 -#> 4.7524 -5.6268 2.4945 1.2878 -8.5590 -1.7245 2.4662 -12.2543 -#> 20.1751 1.5843 12.8122 2.2665 -0.8076 -7.3859 17.5358 -16.9012 -#> -4.6932 9.3886 0.5833 -5.7472 -2.2443 -9.8603 -19.1601 3.3404 -#> 0.8012 -0.8414 2.3527 -1.0912 8.3260 2.9054 2.3910 3.4724 -#> 22.5107 3.1786 12.0413 1.5933 7.6991 -9.4353 13.6481 -2.4569 -#> -3.8804 -11.3571 13.3741 1.0029 -6.2457 12.8943 0.5813 3.3167 -#> -0.2584 -1.0563 -17.1628 -12.4293 1.2945 11.4786 -6.6795 6.9568 -#> -6.6891 -13.8328 2.5390 -0.3918 -2.6157 -3.9123 7.7240 -1.6569 -#> -9.8826 -1.3782 1.5560 13.3592 0.2965 17.9120 7.0649 -11.1058 -#> -5.5006 9.0927 8.0763 2.2972 0.9185 5.0729 -2.5752 2.4860 -#> -#> Columns 25 to 32 2.6157 12.1656 -3.4293 11.2619 29.0579 9.0954 -7.2592 2.0037 -#> 0.8193 -0.0767 3.5949 -10.6917 -5.6788 11.6697 -5.4333 -6.3205 -#> -3.6810 1.5525 -5.6771 -0.1137 4.2578 9.2257 -7.5001 1.5330 -#> -0.0569 -4.2848 9.9154 6.3289 -5.2351 7.7988 4.8216 -2.9912 -#> -3.8000 6.5655 -2.0004 1.7156 -17.9381 -4.2242 3.4018 -1.0467 -#> -9.4344 -9.8031 -12.0416 8.2862 13.2678 -0.7302 0.4335 -8.4655 -#> -7.8939 3.8292 9.9277 0.7376 1.4882 4.2665 10.4770 -8.0951 -#> 10.8728 9.1898 1.6196 -2.5027 1.6465 18.7318 -5.1867 4.6003 -#> -11.2066 -9.4481 -5.4285 -4.1906 -7.5989 -2.8678 -16.7686 -2.0709 -#> -17.2278 -10.1598 10.4174 -3.8264 12.3131 -2.5120 18.8831 -7.7288 -#> 0.3052 5.7338 -7.8968 -1.5548 -9.5453 4.5017 -5.5146 12.2713 -#> -9.3960 3.2968 -12.8012 4.0913 -15.1407 -0.3459 -14.7093 -7.6451 -#> -1.7817 4.8734 5.0182 4.4158 -6.2907 18.4289 -16.1908 10.3766 -#> 0.4862 5.4331 -6.9836 3.9721 -4.2630 9.7002 2.3095 -14.5059 -#> -4.1786 -5.7285 2.5321 6.2680 -4.6553 -2.1756 6.8678 2.6513 -#> -8.6665 -6.0416 -0.5409 0.5216 4.5455 -1.9902 0.3366 0.8028 -#> -10.9652 6.0440 -2.8763 -5.2005 15.1445 1.7218 19.8993 -3.4327 -#> 10.6043 -3.2512 -10.1111 4.6157 -4.0006 -2.1343 -1.4302 10.6029 -#> -0.7327 -0.6984 -2.8468 22.2596 8.2914 2.4505 16.9699 -3.4296 -#> 2.0949 2.8143 5.6126 -0.4120 -5.6877 3.9536 -10.9128 0.8354 -#> 16.6468 7.1970 -5.7594 1.3074 7.8698 0.9892 6.8667 -4.1368 -#> -2.2990 3.1578 -5.4826 -1.8124 2.9626 8.2702 -14.0820 -6.5329 -#> -5.5275 -8.0167 -3.2093 -3.3525 -9.3943 13.0748 -7.8546 -4.9310 -#> -1.6238 -7.5335 -0.1831 3.1611 -7.2427 -1.5588 -5.7129 -0.7068 -#> 8.0404 -3.7601 4.9198 -5.5980 -1.6015 -12.4899 -1.4713 10.5558 -#> -7.2427 2.0596 -5.3566 -7.9559 -4.5101 -5.5048 18.3094 4.4071 -#> -1.3398 2.4388 6.1397 1.8313 3.2439 -6.4362 6.1027 1.0680 -#> 9.4973 -1.7636 0.4137 8.9540 -8.8187 0.1385 -0.4692 -5.5934 -#> 14.0594 -1.4364 3.2645 -2.2201 -1.2013 -4.2073 11.8217 9.4594 -#> 7.2463 -7.3954 14.7617 11.0248 -11.8653 -7.9344 -2.0461 17.9101 -#> -4.8893 0.5295 -10.6414 2.5047 -1.8547 8.1770 -11.2773 3.5899 -#> 6.6188 5.2276 17.5914 -12.3284 13.5262 15.2202 2.1484 -6.2322 -#> -3.2337 -6.4030 2.5477 0.6266 -1.7585 4.5958 -3.8904 -2.0269 -#> -#> Columns 33 to 40 0.0029 -3.3509 -12.6342 0.0077 -13.8816 19.4735 6.4888 -4.3073 -#> 1.5395 0.2632 -5.4625 0.9212 -9.7440 5.6657 -5.8855 -1.8879 -#> 5.9266 0.1231 -6.3651 0.3660 -1.8050 -17.6215 -0.0248 -11.7501 -#> -10.3363 -1.2273 6.9968 -4.5623 24.2431 -8.3985 7.2473 -31.0217 -#> 1.2550 9.8227 3.1240 -18.6046 -13.7359 -3.0107 -1.1276 -2.6916 -#> -8.9907 -9.6516 14.6502 8.2522 3.6585 -7.0430 2.7615 4.5177 -#> -26.4023 -3.6227 6.8321 -8.3929 -11.2319 2.2254 -2.6124 -8.0238 -#> 6.9105 7.0635 -1.4849 2.4072 -19.7923 6.6321 3.8779 2.5356 -#> 6.3276 -8.5924 
-5.0837 13.8570 12.6960 10.3987 11.6815 2.7757 -#> 2.8557 -1.6411 -8.9849 -14.0671 -11.5407 -7.7375 5.4139 -15.6535 -#> 0.7287 -7.0148 -4.2302 21.0221 -13.1672 11.1729 -5.9700 -4.3374 -#> 10.4731 -10.1309 3.3727 20.1838 10.8264 5.5776 -10.2286 -10.2985 -#> 3.3903 -8.1233 -0.6413 4.7653 -13.4528 3.2653 1.5537 -0.2125 -#> -4.8917 -23.8506 8.8801 -4.8572 4.5650 8.1567 8.8407 10.9621 -#> 2.9578 9.7805 -9.9187 2.7422 9.3902 0.2009 -16.6340 10.2006 -#> -3.8805 -6.7501 7.9872 2.7867 -0.0815 -7.9427 -3.5837 -0.0629 -#> 1.6030 -13.3641 11.1395 -1.3434 18.6885 3.3939 9.9179 -1.1933 -#> -18.0162 -8.5544 -0.6633 11.2242 -0.5546 -10.2144 -3.5952 -0.8228 -#> -0.0304 3.3971 -13.0760 12.1379 -18.7442 6.0984 11.3735 13.1533 -#> -1.3424 7.4699 -16.1672 15.2030 -1.1397 1.4846 -9.6994 0.7061 -#> -17.6452 7.4832 -12.2968 20.7037 -5.9765 6.0838 -3.7101 14.0650 -#> 9.8086 17.2381 14.2677 5.9241 -12.9382 15.8107 0.3432 13.0134 -#> -7.9321 19.8009 6.0164 -11.8849 21.6689 7.9013 -0.1456 -14.9485 -#> 3.3430 3.6445 10.6610 -2.1489 -17.0001 -7.2675 6.2436 -3.5449 -#> 3.4334 -2.1735 13.1118 -14.9150 6.0767 -11.2966 4.3785 -12.3123 -#> -2.0132 -6.5839 11.9638 -18.3315 -3.5849 -8.2416 -3.7682 6.0796 -#> -0.0251 -11.1150 7.8212 1.7854 7.5301 -4.5684 0.8110 4.1804 -#> 8.2421 -1.7128 -1.8108 12.8005 17.7682 4.3181 8.1897 -3.9628 -#> 0.6283 2.0559 12.3124 -3.8224 -3.3201 -12.3931 6.2987 3.5806 -#> -2.0786 6.6837 -3.2043 3.0021 -12.1649 0.8647 -19.2996 -11.3476 -#> 3.8575 -17.6842 13.0417 20.2883 -6.0133 -10.4405 -0.8185 3.9150 -#> 6.9854 10.4905 7.3135 -9.4315 -7.8653 -0.3362 -3.7238 3.9443 -#> 7.3930 -1.5779 0.2267 -8.8062 2.0794 -3.5313 10.0632 -14.0367 -#> -#> Columns 41 to 48 4.0510 -1.2702 -10.7254 -0.1518 6.2255 -16.2182 0.0292 13.3147 -#> -2.1117 7.7726 4.7524 -6.3753 9.8414 -0.9707 -1.7454 8.6750 -#> -0.9473 -6.6488 25.7680 -21.5618 -1.7513 8.2230 -4.7134 -9.3620 -#> 2.9891 -9.4650 26.0017 -24.0466 -3.2031 16.5410 -1.8369 21.9893 -#> -1.7402 8.2179 11.1601 -4.2580 2.8483 13.6395 5.4053 2.7420 -#> 8.2215 6.4315 -10.5551 6.0965 -29.5574 9.0842 4.3430 -8.8682 -#> 4.0511 -4.8678 -7.8091 -7.2168 -3.2019 0.7052 2.3832 -13.7279 -#> 3.7785 7.8156 -3.6508 -6.4915 9.9509 10.5353 0.7613 -1.9243 -#> -3.4082 -3.2314 7.5694 -9.4198 1.2483 -10.8518 -0.4544 5.1223 -#> 2.7928 -4.2954 13.6518 -4.1961 -32.9449 8.2825 10.6235 -6.5547 -#> -2.1706 0.5347 -4.1825 9.5515 -1.3363 -0.9141 -8.9849 -2.0354 -#> 11.3627 -0.6651 -7.9567 1.5583 -23.5124 15.5929 0.6318 0.4863 -#> 9.5106 -0.9316 -1.4384 -7.9514 3.3732 2.6686 -4.9989 -3.0646 -#> 1.5673 -3.4351 3.2348 5.1354 -16.7270 2.7156 -10.4553 -14.2158 -#> -2.6128 9.3098 6.3065 -3.1190 -4.8262 8.8057 -9.8614 -5.6171 -#> 19.3820 -7.3091 -1.5507 -3.3476 1.9809 10.8549 2.1270 -3.7262 -#> 7.5950 5.9518 -14.4187 1.6114 -7.7032 7.1640 17.7226 -8.4070 -#> 0.6154 5.7930 -13.4261 5.2674 9.2733 3.7338 -7.8101 1.1300 -#> -8.1761 12.1908 -21.7284 13.0551 -16.3943 -5.5680 -10.0639 3.2579 -#> 1.8726 4.2444 4.9620 -6.5421 -9.1609 10.7864 -0.0778 -10.7919 -#> 9.7009 -2.5378 -15.6215 8.3960 -7.2251 -3.9734 8.8126 -4.3367 -#> 2.8564 -32.7612 16.8721 0.2182 11.6929 -9.1474 -5.9281 3.8051 -#> 1.7788 1.4127 -1.8833 -6.9912 -12.5033 18.4125 3.6896 -15.0041 -#> -5.9166 -7.0237 -7.5080 -8.2371 7.0100 13.8009 -11.6932 -1.6627 -#> 4.9167 -2.6230 0.3647 -3.4979 -20.9022 16.2120 -6.7058 -14.1063 -#> -1.9939 -0.7363 -17.2596 14.2294 -19.4004 8.7278 -0.1109 -9.6073 -#> -4.8946 -0.0505 -6.9824 15.6053 -5.0343 -2.1242 -2.9937 0.5938 -#> -4.5974 10.0005 5.7879 -12.7138 4.0694 -22.5153 -8.6040 19.5947 -#> -7.7176 9.4326 -0.3602 
-1.8544 9.7682 2.2847 -11.9673 -4.2708 -#> -16.3190 -5.7646 -6.2865 -0.0257 2.4428 12.3580 -13.9446 2.4780 -#> 8.4663 -8.2000 9.8663 -13.7863 -1.1844 13.4650 -4.9049 5.9373 -#> 2.4027 0.5915 -21.0702 -1.0956 6.1958 -18.1889 5.0352 -18.4239 -#> -9.9235 -7.0443 -2.4528 -12.1563 3.5734 -3.8522 -9.3329 -9.5187 -#> -#> Columns 49 to 54 2.4848 8.9600 -6.6746 -12.3711 -0.5049 2.8786 -#> 10.6893 -5.4225 9.1650 -13.3324 -13.8708 3.7911 -#> -16.2479 27.7164 0.7434 -11.4350 -8.6765 0.6951 -#> -0.2513 -5.2872 -8.1689 -6.9968 -0.3131 2.7240 -#> -20.2858 15.1985 -7.0279 -9.8774 -2.8880 7.5356 -#> -5.0441 -5.9110 -3.6796 10.4714 -2.4352 -5.0220 -#> -2.9365 3.6362 10.7610 -0.0537 1.7815 -1.1395 -#> 1.9543 -10.6093 -5.5753 -3.9769 -3.0863 1.6045 -#> 3.8588 11.3072 2.7108 -5.5991 4.0178 0.1168 -#> -22.2331 -13.1465 -2.4980 5.7941 -6.1919 -8.3823 -#> -0.9479 7.0255 0.0698 1.3584 0.6481 0.1080 -#> 4.3669 5.3655 -2.4001 3.0138 -11.6708 10.0307 -#> 5.1871 -7.0168 0.4818 -0.0774 -0.6100 2.0541 -#> 17.7132 6.4588 10.1605 -10.7837 -6.7728 -7.9383 -#> -7.6739 6.9783 -1.2624 -0.2311 1.1638 3.9497 -#> -17.5456 -9.7086 2.0341 9.9349 -3.0648 -0.5089 -#> -3.1795 4.3665 17.0024 -8.8965 -1.5841 -8.8056 -#> 2.2149 15.9212 -4.4968 0.0624 -3.1900 2.9269 -#> 11.9618 -2.4289 -17.1797 17.8757 -0.4401 7.4848 -#> 0.8993 9.7873 -2.5482 -0.0111 -6.4800 -1.8389 -#> -12.8026 -5.3223 -10.4207 -7.1540 -3.7932 -0.5174 -#> 3.0748 -8.7669 3.6482 -0.7108 2.5877 2.6914 -#> -1.5469 -5.7033 1.4735 4.2395 1.8257 -3.1212 -#> 20.0404 -0.7424 -2.4644 9.7535 1.2855 5.3932 -#> -4.9952 6.6275 -9.0983 11.9380 -6.4644 -2.4656 -#> 14.4117 -11.4367 6.1705 -2.5055 -4.0090 1.6790 -#> 1.7852 2.5159 5.6664 7.5982 13.7174 -5.9481 -#> -0.8873 -0.4220 -12.1559 -7.4173 -7.3533 -0.6071 -#> 3.2437 1.4743 -7.6123 13.1501 8.1817 -3.4226 -#> 3.7093 -13.1205 -7.3952 9.8814 4.7580 10.3219 -#> 3.5487 -23.3564 5.8674 6.2407 0.5246 4.0238 -#> 8.0665 28.6886 2.2886 -3.1141 11.1435 -2.9868 -#> 4.3882 5.3014 -3.7234 8.4248 -0.0245 -8.0384 -#> -#> (19,.,.) 
= -#> Columns 1 to 8 2.2563 11.1571 5.1803 2.0074 20.4880 1.9959 -11.7897 -3.3816 -#> 1.6023 4.9350 -9.8570 2.5549 3.4270 -6.4630 12.5979 -3.4586 -#> -3.7042 -7.2734 10.1350 -2.1168 4.5737 4.7926 -6.3151 13.1638 -#> 6.8103 -6.0863 3.9821 23.3459 -23.1046 3.2218 -2.8127 -2.0219 -#> 10.0323 -1.5816 -0.3515 -3.7162 -1.0843 -8.9585 -9.5895 -1.1016 -#> 4.5691 0.3556 6.5799 11.4361 -5.5251 -0.7265 -4.4567 7.4877 -#> 3.7387 -5.0253 6.2016 12.1069 1.3906 -3.1792 5.3994 -8.8015 -#> -6.3128 19.6994 0.8121 -5.3457 2.7312 -7.0969 0.8557 -5.9668 -#> -0.6554 -17.8413 -3.6626 7.7413 -3.0338 1.9055 2.3498 4.4129 -#> 2.2045 -17.2951 14.3557 5.4896 -13.9269 13.9771 0.1688 13.7065 -#> 1.4103 -10.9335 6.4542 -7.1951 0.4260 6.0540 -8.6265 -10.6153 -#> -2.4367 -3.0208 1.7582 -3.9473 11.5795 -9.8707 8.2718 3.9334 -#> 2.6910 13.7177 -12.4472 -0.7891 -4.1189 -4.9260 1.9796 6.3284 -#> -2.2626 0.8278 -16.8472 8.7072 -1.5437 -2.5874 18.4442 -12.6984 -#> 0.2770 -5.7886 -1.6831 -6.2280 0.3892 -10.7264 1.6883 0.5731 -#> 5.9888 2.8220 5.1822 -10.9491 2.9278 -1.2400 6.9684 -3.0588 -#> 2.0853 6.4697 9.1610 14.2869 3.9100 7.3812 -5.6622 -2.6049 -#> -2.8214 -2.8717 3.4763 -4.1231 4.3919 -7.9700 14.0622 -10.4217 -#> 0.0360 -7.2489 7.1088 -5.2261 7.6209 -6.4602 3.4370 -3.3627 -#> 3.0146 -8.4109 7.3443 6.8336 -16.3648 7.2043 7.3347 -7.1156 -#> 11.6654 -2.6083 -1.8932 -11.5787 -7.3925 -12.9529 -8.5728 6.7741 -#> -7.3200 10.8262 -1.1320 -12.8965 17.0946 -4.7953 1.3419 -5.9471 -#> 0.9198 -4.3271 -5.4991 3.7542 -11.3688 7.0314 -0.1410 -5.9312 -#> 5.9632 8.4848 2.1999 4.6207 0.4360 1.6097 17.6682 7.0791 -#> -3.5695 -2.5270 -0.5916 -8.6880 -8.8335 14.8022 4.9419 -2.7857 -#> -5.3153 -5.2006 1.6115 9.8521 -2.7637 13.6420 20.2294 2.9587 -#> 0.1992 -6.7249 2.8040 3.9956 -7.6735 16.6353 0.2961 -0.4600 -#> -0.9324 4.6584 -9.8605 -3.0216 3.8279 -18.3609 -14.6428 2.9325 -#> -2.3640 2.2242 -7.8729 -2.3704 11.0849 -3.0016 12.9536 4.4597 -#> 0.6707 -3.5127 -1.1921 7.2875 5.6279 -5.3606 16.7637 -0.5393 -#> -6.0561 6.4866 -2.1347 2.0663 -9.1948 -4.9570 4.9678 -2.9562 -#> 1.7940 8.1519 -9.7820 -8.1634 8.2479 7.6515 -7.6138 20.4446 -#> 1.0375 8.2653 -6.3321 4.4135 7.8574 2.1843 10.2095 -3.0581 -#> -#> Columns 9 to 16 -6.7446 -19.2839 -5.2996 10.6448 -2.2137 -9.8432 16.7385 1.8806 -#> 2.7721 -9.6319 15.9549 -3.9562 -2.7533 -1.4063 17.1808 -3.8886 -#> 0.8954 3.4040 -4.2748 -7.6926 -1.9848 17.2263 0.3864 -4.7782 -#> 13.6218 3.0880 -9.0650 13.7402 14.2595 14.3139 8.1423 -10.8839 -#> -2.8056 -1.5642 -10.9384 2.4840 -14.6773 15.3640 -11.8735 8.1836 -#> -12.2327 8.0373 -9.9830 -7.0531 -2.4499 -2.4429 -9.1589 7.2055 -#> 6.7662 1.5342 1.1651 4.3661 -5.1278 7.6165 2.6917 -7.9761 -#> -4.9940 -12.7029 -0.5018 4.1232 7.7914 -3.3340 3.7179 -15.3366 -#> -2.8687 11.0463 -19.0638 0.3108 8.7410 -5.9274 -4.5910 10.8263 -#> 2.7317 2.1383 -13.6276 31.3021 5.1940 6.0216 -4.0457 -5.5339 -#> -8.8773 1.3894 6.0943 8.2040 -22.9392 6.0407 -5.8266 9.3254 -#> -5.3709 6.6576 -0.3668 0.5234 -0.9117 8.5379 -2.6543 5.2700 -#> -4.0529 -2.2297 -3.6299 8.0911 7.2778 -1.6554 15.2207 -8.6982 -#> 2.9273 10.0753 1.0093 2.4645 6.0291 2.2682 1.5076 3.4788 -#> -16.8549 8.1650 -5.1286 -7.7697 -3.0778 13.3012 -7.2841 0.8980 -#> -4.7963 -5.0447 11.0234 19.7087 -13.5973 -1.4497 -6.0398 3.9615 -#> -11.6929 -12.9310 3.0861 3.8349 -0.8498 0.3676 7.5970 -15.8111 -#> 10.5982 -8.7752 -4.8810 3.8208 -2.7940 0.2052 2.5400 8.1477 -#> -3.8651 11.3115 -15.3378 -0.9253 7.8896 -0.5798 -9.4853 21.2292 -#> 1.5519 2.7537 7.5006 5.2048 3.9256 3.6954 -14.5668 -1.0811 -#> -6.4954 -15.8022 2.8659 9.0508 11.4639 
-2.6907 -8.2854 -1.4667 -#> 7.8183 -1.5043 9.2190 -3.2909 -3.6400 -7.3343 4.0588 3.0736 -#> 1.6113 -7.8219 7.8065 20.6163 -7.6171 -11.7764 8.7982 -8.1107 -#> 4.7483 9.2704 -0.8942 2.4555 7.9658 2.8396 -11.1332 6.9196 -#> 9.2414 3.9240 7.5839 -10.3013 -8.0429 -0.9105 7.5698 5.5207 -#> 9.1179 9.3893 -2.6667 -0.5141 6.6394 2.4894 -3.3466 -12.0105 -#> -7.6842 7.3149 -2.5730 4.5251 -6.0077 4.9602 5.9953 2.1281 -#> -10.2351 -0.4251 2.1078 -6.7500 6.6732 -0.6379 -4.8256 7.8820 -#> 7.6989 5.7581 -0.8032 -1.8760 -1.4233 4.0420 -9.7446 -10.8131 -#> 13.1536 -3.2820 -5.9773 0.4688 -7.2200 -1.9802 6.8597 7.1834 -#> -6.9597 -6.3075 -10.8629 6.5933 5.1967 -2.3144 -13.2033 -6.3124 -#> -15.1643 11.3891 1.7059 -1.9721 6.4254 -8.3010 13.8574 -8.0904 -#> -0.6400 0.8845 9.1436 12.9765 -0.8867 -5.4252 -2.4219 2.5684 -#> -#> Columns 17 to 24 -11.6251 -3.3042 -9.5897 -19.7662 -11.3639 2.9764 -7.6847 5.3094 -#> 5.5535 12.3224 -0.4621 -1.7751 -6.2082 -2.8243 -3.3639 4.4086 -#> 4.9465 -6.1456 13.8044 -2.3682 0.1131 -1.0469 -0.6537 -18.7963 -#> -0.2615 3.3738 12.8766 -0.7729 13.0538 -3.7030 21.3555 -0.0547 -#> -5.4725 -2.8746 3.7941 -7.9571 -14.6536 -3.4342 -0.7045 1.1588 -#> 3.2876 -3.8150 2.2217 -7.6296 -6.1589 9.3530 -2.3499 17.2963 -#> -8.1793 7.5804 12.0472 -14.9795 2.5250 4.4984 18.0541 8.3617 -#> -10.3372 2.5190 -17.6749 -9.9759 -7.6969 -6.4078 13.4594 11.9478 -#> -5.4648 -6.1133 -3.1051 15.8897 -0.9202 8.7116 5.8395 -20.7721 -#> -4.9633 7.0661 1.5125 -9.6190 -6.7895 3.5231 -0.0329 23.3810 -#> -15.8677 12.4208 2.0475 6.4192 -9.7269 -0.6899 -15.3368 17.5331 -#> -7.8009 13.4144 -4.8032 15.5606 -10.1968 9.3907 -12.3619 0.2271 -#> -17.6470 3.5627 12.0824 -4.9767 -6.5277 0.2517 1.5120 17.2585 -#> -7.6810 2.1964 14.6920 -5.9527 10.4770 -6.3552 7.7612 9.4945 -#> 0.4429 -3.4509 -2.8073 0.4457 -8.6056 -3.4702 -23.0327 0.9529 -#> 0.8179 4.0997 2.5110 -15.3957 3.0363 10.0195 -9.8834 -8.2598 -#> 0.9310 -15.8169 -2.7237 -3.8845 -11.1570 3.2370 -4.0259 -0.8774 -#> -5.0513 18.1145 -1.9949 18.7507 16.7231 7.7012 -4.9110 -4.4177 -#> -32.6358 0.2473 -4.0207 2.8901 -18.4278 11.5049 -11.9611 8.3352 -#> 3.0942 8.3002 -8.3140 -5.4769 9.5909 -5.1340 1.8389 9.0214 -#> -16.4604 -16.0312 -13.7154 5.3215 -13.6139 12.0238 -12.3383 -9.2225 -#> -5.7482 1.7096 -0.1163 14.3944 0.9409 -12.6108 -5.2439 -6.4736 -#> -2.8593 -0.6078 5.9682 -8.9257 6.8423 6.6952 1.2522 -8.5496 -#> 0.1467 9.9529 4.9536 4.2114 14.9164 8.9636 -11.2608 -6.2331 -#> 7.4674 -1.1952 -6.9978 -3.2490 11.2160 -6.2838 -1.1349 5.1366 -#> 11.8175 3.2338 2.4789 -7.7572 9.2989 -6.1385 -4.1347 4.6263 -#> 3.8211 -2.9451 12.8165 -13.0431 2.9364 -0.6053 -3.3417 0.7684 -#> -5.6006 -21.6590 0.9646 13.6898 -1.2653 -14.3905 4.7676 -5.7793 -#> 3.3701 -2.2316 -6.6199 -1.5730 0.6153 2.5474 14.7909 14.1248 -#> 1.4418 6.3772 0.3450 6.5393 9.4165 15.8391 -5.4509 -7.6689 -#> 9.1360 -3.4296 4.2381 0.1421 -6.0637 -10.6114 -10.5209 -1.9886 -#> 8.9036 -12.5969 22.4951 -0.5840 -12.5121 15.3569 -18.5965 9.6492 -#> 1.6620 -3.6564 29.1041 -15.0301 24.3040 2.8217 5.5570 -3.0949 -#> -#> Columns 25 to 32 0.6754 -5.4351 -1.7427 -2.1690 -12.6507 -7.9582 -2.0196 -5.9914 -#> 6.2957 24.2517 -0.6646 5.8407 4.2708 0.6678 0.8753 7.4855 -#> 9.4350 -1.2963 -3.6902 -5.5026 2.7794 -3.1165 -1.7660 1.7011 -#> 10.9539 -17.5292 4.3171 -8.1743 -14.3399 9.7072 -7.5103 9.4845 -#> -3.8901 7.8685 11.0553 -7.6921 -0.4697 11.2485 -3.0967 7.2322 -#> 5.9094 3.8987 8.5515 24.0678 -3.1706 2.4028 0.1868 -4.1452 -#> -4.8616 5.1777 23.9718 -11.5407 7.5438 3.0655 -15.7713 -8.2458 -#> -0.8644 5.6632 -3.6068 -3.0043 14.2817 7.5471 5.6906 
-4.2427 -#> -7.6225 -10.5315 -8.6553 -2.2209 -15.1343 -2.7813 14.5287 -10.1209 -#> 8.1794 -1.0438 17.5802 1.3836 6.7183 10.8328 -2.5115 2.8458 -#> 2.9136 -10.5741 12.6177 0.9100 2.0254 3.3510 -2.8068 -13.4172 -#> -13.1170 1.6400 -13.0929 -10.3412 5.5702 -11.1293 2.2765 3.6344 -#> 1.1890 -1.9082 -2.3892 -8.1190 2.6411 -3.1444 -6.9475 -6.7670 -#> -14.5015 11.7778 -2.8347 -13.3463 -5.1831 4.0238 0.1387 2.6846 -#> 11.0696 -13.9293 -6.1347 7.4526 -5.8511 -3.2405 -0.3918 13.2905 -#> 17.0813 -5.0977 6.4885 12.9523 8.3253 -10.0022 -6.1836 14.6672 -#> -9.5167 -0.1452 8.0347 9.8185 -4.8826 7.0202 -13.3114 -7.3593 -#> -3.5866 -2.1113 6.1158 -4.5314 0.5352 3.9646 -8.2554 1.0600 -#> -6.3913 -3.4983 -2.0942 2.1771 1.5844 -3.6043 4.8755 -9.8164 -#> 11.0548 -20.8779 -16.4557 5.0648 -3.3323 -10.0792 18.6829 -5.9163 -#> -2.9553 -3.4663 5.8928 -17.9741 7.4210 14.4276 12.8937 3.8324 -#> 6.7334 12.8358 -6.2878 -5.9928 12.9341 1.9918 -0.3133 0.8225 -#> -8.5431 -4.5959 5.7668 -6.6540 6.5829 6.1331 1.0447 -7.4261 -#> -5.5379 0.1180 -2.7121 -5.2564 -3.0441 -9.8881 4.7905 -2.0531 -#> 1.8131 -4.7013 -3.3520 6.5667 3.2757 5.0640 9.5182 2.7596 -#> -7.9812 6.5552 -21.8246 15.2338 -1.6433 -0.1308 4.5212 -3.7304 -#> 1.0232 1.4512 -3.1897 -7.0585 -15.2776 3.4033 -12.1728 -0.5707 -#> 6.9714 -8.0632 -3.9023 -1.7853 -7.5021 9.3818 9.8124 -7.2047 -#> -8.3226 -3.2363 5.5813 -9.9950 -1.7240 17.5094 -6.7417 -5.7870 -#> 9.7854 15.8732 13.8283 -0.1509 -6.0653 9.1286 -3.0200 -1.7873 -#> -1.2083 -4.4204 -5.7270 11.8766 -5.3162 -0.7704 9.3954 -1.7334 -#> -2.2619 11.5894 -10.0027 -5.0707 4.5968 -5.3793 -13.2128 12.2694 -#> -9.4825 -1.5247 -0.3769 3.1203 5.3710 -5.7538 -8.5316 -1.2781 -#> -#> Columns 33 to 40 -9.2212 17.9095 8.4176 -3.7203 -6.5719 -1.9328 -3.5606 4.5551 -#> -1.8634 6.8070 5.5377 7.7612 -0.3723 -8.5310 -11.2221 12.8836 -#> -0.5835 -20.5723 6.2568 -13.6192 6.2027 2.5340 8.3277 -1.0690 -#> 9.4813 -13.0169 -6.5955 -9.6504 3.9302 -10.4059 0.3368 11.0317 -#> 13.2242 -6.7998 -7.4696 -4.5022 7.5813 -2.4320 18.8617 6.3256 -#> 5.4512 5.7200 4.0642 -1.2367 -0.7577 13.4109 1.9488 2.7067 -#> 17.0755 -5.8967 -4.9994 2.3772 -3.3988 7.8152 -7.2953 -7.5021 -#> -8.0343 -4.6180 2.3566 9.2143 5.1217 18.9750 7.0116 0.8720 -#> 8.8868 -12.8316 32.2617 1.5173 -29.2014 -9.1697 1.7793 -11.5384 -#> -8.6272 -4.4338 -21.9003 4.1937 -13.0200 8.8821 0.1906 13.0268 -#> 11.8491 11.2913 -2.7882 -20.0160 22.0333 16.8525 -3.0531 -2.2457 -#> 3.9485 -1.9158 2.2538 -21.1789 -3.8281 0.7445 2.8729 5.1457 -#> 4.8669 -7.2527 -3.9915 -13.5313 13.7822 12.5903 8.1406 -10.5367 -#> -2.3059 2.7685 -3.8104 11.5816 -24.4048 -18.5444 -8.2961 3.2680 -#> -16.0640 -19.7496 -16.1403 -6.2305 15.5044 0.6679 0.0522 19.7908 -#> 11.9814 5.5630 -0.8154 -19.5417 9.2743 8.0217 -15.5096 15.4365 -#> 4.4094 14.3800 3.7023 -2.6637 -6.9884 15.1600 -10.9028 8.2918 -#> 8.1691 17.8184 3.8099 3.3481 -12.3396 -3.6603 -0.4540 -1.5425 -#> -15.0172 -12.5406 2.6438 1.7852 10.4317 16.6561 -11.3111 -20.8243 -#> -12.4438 -10.9364 3.7333 -6.6735 13.5679 -1.7383 -2.9847 6.8149 -#> -1.8302 17.7674 5.9496 1.7236 -20.1008 2.4569 4.4831 -5.7613 -#> -0.7342 -9.7959 -7.8026 -6.3387 -2.7167 13.1393 7.8315 9.4000 -#> 9.4051 9.6574 14.3628 -27.0737 -19.5745 -15.6395 -3.8150 2.3063 -#> -9.8362 7.6502 0.7495 5.7295 -0.4509 -10.2346 3.5372 -6.7736 -#> -10.6270 14.9071 3.1708 -15.4192 2.3833 -13.9320 -9.1408 3.5938 -#> -9.1457 -11.3525 -5.3337 4.7483 -4.7272 -3.0666 0.1602 2.4069 -#> -8.1658 3.7447 -3.9244 9.3006 -1.0673 -7.4670 -3.5135 -1.1845 -#> -3.0652 9.2276 17.3474 -4.8320 0.7358 -4.0601 13.5146 -17.2049 
-#> -12.7639 1.8174 -5.2838 10.5123 11.9389 1.5286 5.6593 0.2460 -#> -12.6807 2.0698 -15.8524 -7.2091 -7.4212 0.1610 13.4838 2.1601 -#> 7.2512 -1.3859 0.5661 -15.6594 -3.3170 -2.9406 0.2242 -3.9722 -#> -25.7610 -6.7176 10.3714 12.6513 -11.1680 15.6171 -19.9599 1.9016 -#> -2.6267 20.8037 -2.0124 4.1648 1.2791 6.2882 -3.8924 3.8411 -#> -#> Columns 41 to 48 12.6592 -29.1091 4.9542 6.5500 -6.5881 -9.1306 -3.5637 6.9740 -#> 12.5785 4.9390 -7.1512 9.5374 19.1770 -2.1619 -8.8319 -6.1387 -#> 1.6069 0.0530 -7.3822 0.7928 4.1775 -0.2300 -11.3976 15.9961 -#> -0.9627 5.4817 -15.4045 10.0725 5.5603 -11.5714 12.0004 1.3725 -#> -2.4268 17.9863 10.2603 6.4608 12.7709 1.9074 -2.4812 2.6753 -#> 6.0541 0.7232 12.2480 5.2808 1.4006 10.6693 12.6639 12.8870 -#> -6.6617 -14.0500 -14.0936 -3.0179 -11.9357 5.9934 3.5963 -6.5327 -#> -1.3355 -2.3099 -5.0284 8.5370 3.2054 -19.0103 -6.5679 6.2471 -#> 10.6122 11.9086 -0.1262 6.2764 -3.3073 0.9882 0.3496 13.5524 -#> 1.8818 3.3550 -19.4137 -0.5076 8.5231 5.5520 7.0628 -4.7189 -#> 2.0409 -11.5721 1.2794 4.2566 -1.2600 11.6436 8.4207 9.9688 -#> 7.2530 -0.5176 9.8714 18.3778 1.0340 4.8684 10.6200 8.9634 -#> -1.2232 -8.9204 -2.1755 -1.1896 -3.3770 -2.9910 0.3296 -5.7852 -#> -7.5940 5.8494 -15.8490 -10.0028 6.6867 3.4989 2.2816 -7.2253 -#> 6.5804 -0.6134 -2.1385 15.5190 8.1329 -11.0707 11.2124 -2.0956 -#> 20.2437 -4.8852 -6.5738 2.1046 -1.0111 -4.7278 1.7603 7.4301 -#> 4.8448 1.4679 3.6642 -7.8998 -14.3289 8.8894 7.9565 2.9185 -#> -2.0559 3.1594 5.7790 -13.0144 -14.4777 -5.6680 -10.7241 -4.0730 -#> -2.1987 -0.4621 -29.1733 -3.3608 -10.2317 -5.5618 -2.7433 16.9024 -#> -2.5571 -18.0090 -6.6635 17.0925 7.7247 -9.7994 -3.9254 9.3806 -#> -2.6549 11.5325 1.9270 -18.3873 -4.6849 3.4927 -5.3377 8.0902 -#> -6.9130 0.0039 0.7236 -23.0587 10.3118 17.0714 10.3968 9.7206 -#> 7.9891 -6.5689 -13.5171 4.7222 1.5046 -7.6745 -7.4960 19.0716 -#> 18.9985 2.7868 8.8201 3.8623 4.7100 -1.8755 1.1543 0.4049 -#> -1.9884 -16.9054 6.6718 6.4880 5.4618 -16.1783 -5.3956 5.4232 -#> 2.3665 6.1933 11.9676 5.5544 -6.5978 9.0037 -4.2192 -6.4146 -#> -8.1628 -11.2495 -7.0179 -6.8217 -1.2933 6.0335 -4.5630 -6.4349 -#> -9.8572 -11.5498 -4.4926 -3.8380 0.7976 -1.0803 8.9509 24.5363 -#> -16.4516 13.6568 10.2520 -10.9182 7.4908 -13.2014 -1.2110 -18.3038 -#> -0.4914 4.0439 -2.5239 7.8569 -3.8332 -2.2241 4.7996 -14.7188 -#> 10.2891 6.7213 -2.7195 -8.7093 0.4224 8.2590 14.4842 5.9226 -#> -2.3089 -15.3783 -6.7664 -7.4489 5.3329 -1.4130 -3.1034 -6.9561 -#> 0.3502 -1.1984 -10.8877 -5.4941 5.8245 -7.2559 2.3815 5.3173 -#> -#> Columns 49 to 54 1.2747 5.2395 13.4698 14.2738 12.2964 -1.5817 -#> 1.6910 -12.1397 -12.7877 -10.0018 7.3331 0.6355 -#> 0.7433 -5.8941 -6.2776 4.1104 2.8945 0.8537 -#> -12.5226 -6.5539 -10.7138 7.6140 -1.7271 -7.8163 -#> 3.8362 -6.7615 2.8636 -6.1179 -3.3340 -9.7257 -#> -1.7788 4.3488 22.8753 1.8028 5.4304 -1.8112 -#> -1.4235 3.3501 5.2342 4.2336 -14.7682 -7.2064 -#> 10.1970 3.3946 -1.0360 -1.1210 -4.4042 -4.1440 -#> 11.1277 0.9043 12.8983 13.0145 5.9844 -1.5443 -#> -4.2875 -5.3223 3.4411 -5.3851 -1.3105 0.6720 -#> -6.5519 12.1854 0.0411 -1.8376 0.8744 -3.2010 -#> -0.1420 7.3235 7.9048 7.8784 4.8304 -10.4523 -#> -3.6911 4.2254 5.7063 8.9996 -4.4160 0.4081 -#> 8.5811 -7.6967 -5.5423 -5.7741 13.3399 1.0483 -#> 7.6063 -7.0702 3.5762 -6.5361 15.6316 -0.6609 -#> -9.8814 1.9607 -0.4300 -0.0419 2.0097 9.4041 -#> 13.4482 10.9558 14.7211 0.1087 -0.7896 2.0498 -#> -16.6676 -4.7743 -7.8918 1.7374 -4.2081 -5.8012 -#> -8.2581 -6.0684 8.2776 -10.0404 9.0940 -0.6857 -#> -3.9399 6.9471 -3.0039 -4.1868 0.1530 -1.3818 
-#> ... (remaining random example output elided) ...
-#> [ CPUFloatType{20,33,54} ]
    # } -
diff --git a/docs/reference/torch_conv_transpose2d.html b/docs/reference/torch_conv_transpose2d.html
deleted file mode 100644
index db2a334e788a9e0d6a7476f25de952a7e6e2d142..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conv_transpose2d.html
+++ /dev/null
@@ -1,305 +0,0 @@
-Conv_transpose2d — torch_conv_transpose2d • torch

    Conv_transpose2d

Arguments

input

input tensor of shape \((\mbox{minibatch} , \mbox{in\_channels} , iH , iW)\)

weight

filters of shape \((\mbox{in\_channels} , \frac{\mbox{out\_channels}}{\mbox{groups}} , kH , kW)\)

bias

optional bias of shape \((\mbox{out\_channels})\). Default: None

stride

the stride of the convolving kernel. Can be a single number or a tuple (sH, sW). Default: 1

padding

dilation * (kernel_size - 1) - padding zero-padding will be added to both sides of each dimension in the input. Can be a single number or a tuple (padH, padW). Default: 0

output_padding

additional size added to one side of each dimension in the output shape. Can be a single number or a tuple (out_padH, out_padW). Default: 0

groups

split input into groups; \(\mbox{in\_channels}\) should be divisible by the number of groups. Default: 1

dilation

the spacing between kernel elements. Can be a single number or a tuple (dH, dW). Default: 1

conv_transpose2d(input, weight, bias=None, stride=1, padding=0, output_padding=0, groups=1, dilation=1) -> Tensor

Applies a 2D transposed convolution operator over an input image composed of several input planes, sometimes also called "deconvolution".

See torch.nn.ConvTranspose2d for details and output shape.

Examples

# \dontrun{

# With square kernels and equal stride
inputs = torch_randn(c(1, 4, 5, 5))
weights = torch_randn(c(4, 8, 3, 3))
nnf_conv_transpose2d(inputs, weights, padding = 1)
    #> torch_tensor -#> (1,1,.,.) = -#> 3.1282 4.7949 -5.5622 -1.0866 3.7899 -#> 3.4914 4.8807 4.3209 0.8437 7.1147 -#> -5.0276 7.5935 2.4507 6.2129 4.9112 -#> 2.9302 -4.0742 1.0907 0.0252 4.4256 -#> 4.1733 6.6849 -0.1333 0.7716 1.2488 -#> -#> (1,2,.,.) = -#> 4.0475 8.1682 9.4413 -3.4628 -2.3695 -#> -3.0556 -4.0963 5.6845 2.0032 0.2438 -#> 1.7169 -2.3353 -2.5287 -5.3750 -4.0894 -#> 1.7329 17.4464 -1.9850 -1.2224 -1.0126 -#> -8.3888 0.5081 -5.4379 -7.7908 1.4902 -#> -#> (1,3,.,.) = -#> -4.6889 1.3331 -3.8890 -4.2812 -1.6408 -#> -5.7474 6.4888 -0.3864 -0.5556 3.3423 -#> -0.5830 -4.7014 0.4339 -4.4822 -0.9338 -#> -3.2573 -5.3475 -6.7339 -4.1705 5.4993 -#> 1.4175 5.6303 -1.1562 5.8984 3.9368 -#> -#> (1,4,.,.) = -#> 2.7974 -1.8220 1.8960 -2.4363 9.3931 -#> -0.0791 9.0332 2.4753 6.5632 -1.9094 -#> -0.4198 -4.7226 4.5077 -6.0814 0.9503 -#> -0.6672 3.4472 -9.0451 1.0115 -4.7566 -#> 6.7951 6.0656 9.2166 3.3023 1.2087 -#> -#> (1,5,.,.) = -#> 4.9513 -5.1344 0.4485 2.9806 0.6510 -#> 2.6860 -0.6071 5.0654 6.0352 -0.5143 -#> 3.0599 2.8382 -1.2406 -3.1389 -6.3846 -#> -5.3770 3.7280 -12.7695 -4.8459 3.4087 -#> 1.6425 -3.5262 -2.5308 5.2363 0.8194 -#> -#> (1,6,.,.) = -#> 1.6877 11.4338 1.1768 -0.3375 -1.5256 -#> -3.2510 -1.9791 0.4848 -15.0722 2.7618 -#> 0.3499 4.9010 -0.0095 -6.5474 -4.1558 -#> 3.7509 8.8759 8.5394 2.6775 -2.9372 -#> -6.3247 2.4670 -8.6403 -6.7153 -1.6774 -#> -#> (1,7,.,.) = -#> 4.5894 2.9587 3.1108 0.5861 -1.6432 -#> 0.2160 -2.0561 2.0616 0.3782 0.4569 -#> -0.6198 -5.6256 -4.3481 -0.5680 -6.8947 -#> -1.1064 3.6694 3.1163 -0.8316 5.3151 -#> -2.6586 -10.2829 -6.7049 1.4970 0.0378 -#> -#> (1,8,.,.) = -#> -0.6657 -4.0047 -1.7929 -2.8312 -5.8341 -#> 0.8324 4.9875 -2.9101 4.2271 -0.9717 -#> -2.1073 2.4393 0.2561 -7.6416 -2.0861 -#> -0.1300 5.5534 -0.9693 0.1627 -0.2768 -#> 6.7889 -0.3533 3.2501 3.3338 -0.3276 -#> [ CPUFloatType{1,8,5,5} ]
    # } -
diff --git a/docs/reference/torch_conv_transpose3d.html b/docs/reference/torch_conv_transpose3d.html
deleted file mode 100644
index babac9b8d4403691ddefffa800b7d9bd5355d623..0000000000000000000000000000000000000000
--- a/docs/reference/torch_conv_transpose3d.html
+++ /dev/null
@@ -1,243 +0,0 @@
-Conv_transpose3d — torch_conv_transpose3d • torch

    Conv_transpose3d

Arguments

input

input tensor of shape \((\mbox{minibatch} , \mbox{in\_channels} , iT , iH , iW)\)

weight

filters of shape \((\mbox{in\_channels} , \frac{\mbox{out\_channels}}{\mbox{groups}} , kT , kH , kW)\)

bias

optional bias of shape \((\mbox{out\_channels})\). Default: None

stride

the stride of the convolving kernel. Can be a single number or a tuple (sT, sH, sW). Default: 1

padding

dilation * (kernel_size - 1) - padding zero-padding will be added to both sides of each dimension in the input. Can be a single number or a tuple (padT, padH, padW). Default: 0

output_padding

additional size added to one side of each dimension in the output shape. Can be a single number or a tuple (out_padT, out_padH, out_padW). Default: 0

groups

split input into groups; \(\mbox{in\_channels}\) should be divisible by the number of groups. Default: 1

dilation

the spacing between kernel elements. Can be a single number or a tuple (dT, dH, dW). Default: 1

conv_transpose3d(input, weight, bias=None, stride=1, padding=0, output_padding=0, groups=1, dilation=1) -> Tensor

Applies a 3D transposed convolution operator over an input image composed of several input planes, sometimes also called "deconvolution".

See torch.nn.ConvTranspose3d for details and output shape.

Examples
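This Examples section was empty in the generated page; the lines below are a minimal, untested sketch mirroring the conv_transpose2d example above (it assumes nnf_conv_transpose3d is the functional wrapper, by analogy with nnf_conv_transpose2d used there).

# \dontrun{

# With square kernels: input is (minibatch, in_channels, iT, iH, iW)
inputs = torch_randn(c(1, 4, 5, 5, 5))
weights = torch_randn(c(4, 8, 3, 3, 3))
nnf_conv_transpose3d(inputs, weights, padding = 1)
# }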

diff --git a/docs/reference/torch_cos.html b/docs/reference/torch_cos.html
deleted file mode 100644
index f6e3cd81829da8d400e4010ae5a279ae9b5303e6..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cos.html
+++ /dev/null
@@ -1,233 +0,0 @@
-Cos — torch_cos • torch

    Cos

Arguments

input

(Tensor) the input tensor.

out

(Tensor, optional) the output tensor.

cos(input, out=None) -> Tensor

Returns a new tensor with the cosine of the elements of input.

$$
\mbox{out}_{i} = \cos(\mbox{input}_{i})
$$

Examples

# \dontrun{

a = torch_randn(c(4))
a
    #> torch_tensor -#> -0.6755 -#> -0.8212 -#> -0.0338 -#> 0.3401 -#> [ CPUFloatType{4} ]
    torch_cos(a)
    #> torch_tensor -#> 0.7804 -#> 0.6813 -#> 0.9994 -#> 0.9427 -#> [ CPUFloatType{4} ]
    # } -
diff --git a/docs/reference/torch_cosh.html b/docs/reference/torch_cosh.html
deleted file mode 100644
index 3f16cf32af3a7d5f339cfc3740d19fe5190cc38d..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cosh.html
+++ /dev/null
@@ -1,234 +0,0 @@
-Cosh — torch_cosh • torch

    Cosh

Arguments

input

(Tensor) the input tensor.

out

(Tensor, optional) the output tensor.

cosh(input, out=None) -> Tensor

Returns a new tensor with the hyperbolic cosine of the elements of input.

$$
\mbox{out}_{i} = \cosh(\mbox{input}_{i})
$$

Examples

# \dontrun{

a = torch_randn(c(4))
a
    #> torch_tensor -#> -0.5781 -#> -0.2866 -#> -1.7790 -#> -0.1226 -#> [ CPUFloatType{4} ]
    torch_cosh(a)
    #> torch_tensor -#> 1.1718 -#> 1.0414 -#> 3.0465 -#> 1.0075 -#> [ CPUFloatType{4} ]
    # } -
diff --git a/docs/reference/torch_cosine_similarity.html b/docs/reference/torch_cosine_similarity.html
deleted file mode 100644
index c5be3229f8f263b616241cf973d85eb08f7a22cb..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cosine_similarity.html
+++ /dev/null
@@ -1,334 +0,0 @@
-Cosine_similarity — torch_cosine_similarity • torch

    Cosine_similarity

Arguments

x1

(Tensor) First input.

x2

(Tensor) Second input (of size matching x1).

dim

(int, optional) Dimension of vectors. Default: 1

eps

(float, optional) Small value to avoid division by zero. Default: 1e-8

cosine_similarity(x1, x2, dim=1, eps=1e-8) -> Tensor

Returns cosine similarity between x1 and x2, computed along dim.

$$
\mbox{similarity} = \frac{x_1 \cdot x_2}{\max(\Vert x_1 \Vert _2 \cdot \Vert x_2 \Vert _2, \epsilon)}
$$

Examples

# \dontrun{

input1 = torch_randn(c(100, 128))
input2 = torch_randn(c(100, 128))
output = torch_cosine_similarity(input1, input2)
output
    #> torch_tensor -#> 0.0628 -#> 0.0734 -#> 0.0535 -#> -0.0163 -#> -0.0151 -#> -0.0383 -#> -0.0902 -#> 0.0108 -#> -0.0247 -#> 0.0816 -#> 0.0586 -#> 0.0277 -#> -0.0927 -#> 0.0636 -#> 0.0423 -#> 0.0609 -#> 0.1381 -#> -0.0185 -#> 0.0668 -#> -0.0083 -#> -0.0827 -#> -0.0799 -#> 0.0255 -#> -0.0536 -#> 0.0417 -#> 0.1178 -#> -0.0586 -#> -0.0301 -#> -0.2182 -#> -0.0238 -#> 0.0960 -#> -0.1743 -#> 0.0430 -#> -0.0019 -#> -0.0712 -#> 0.1294 -#> -0.0705 -#> -0.0441 -#> -0.0381 -#> -0.0269 -#> 0.0380 -#> 0.2009 -#> 0.0309 -#> -0.0537 -#> 0.0422 -#> -0.0888 -#> -0.0909 -#> -0.0396 -#> -0.0815 -#> 0.0297 -#> -0.0226 -#> 0.0781 -#> -0.1015 -#> -0.0516 -#> 0.1183 -#> 0.1247 -#> -0.0117 -#> 0.0998 -#> 0.0107 -#> -0.1497 -#> -0.0889 -#> 0.0906 -#> -0.0145 -#> -0.1604 -#> -0.0323 -#> 0.0500 -#> -0.1800 -#> 0.0532 -#> 0.0932 -#> 0.0290 -#> 0.0148 -#> -0.0677 -#> 0.0150 -#> 0.1278 -#> 0.0463 -#> -0.0320 -#> 0.0187 -#> -0.0964 -#> 0.0039 -#> -0.0098 -#> -0.0187 -#> -0.1616 -#> -0.0879 -#> -0.0506 -#> 0.0167 -#> -0.0330 -#> -0.0717 -#> 0.1178 -#> -0.0280 -#> 0.0411 -#> -0.1074 -#> -0.0523 -#> -0.1518 -#> -0.0476 -#> -0.0382 -#> 0.0293 -#> 0.0484 -#> -0.0200 -#> -0.1260 -#> 0.0981 -#> [ CPUFloatType{100} ]
    # } -
diff --git a/docs/reference/torch_cross.html b/docs/reference/torch_cross.html
deleted file mode 100644
index c2c946b9ff7fb556df0d840860118d32e34745df..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cross.html
+++ /dev/null
@@ -1,254 +0,0 @@
-Cross — torch_cross • torch

    Cross

Arguments

input

(Tensor) the input tensor.

other

(Tensor) the second input tensor

dim

(int, optional) the dimension to take the cross-product in.

out

(Tensor, optional) the output tensor.

cross(input, other, dim=-1, out=None) -> Tensor

Returns the cross product of vectors in dimension dim of input and other.

input and other must have the same size, and the size of their dim dimension should be 3.

If dim is not given, it defaults to the first dimension found with size 3.

Examples

# \dontrun{

a = torch_randn(c(4, 3))
a
    #> torch_tensor -#> -1.1643 -0.5561 0.7230 -#> -0.0220 2.0844 -0.3671 -#> -1.4814 -0.9811 -1.6675 -#> 0.4909 0.6285 -0.6379 -#> [ CPUFloatType{4,3} ]
b = torch_randn(c(4, 3))
b
    #> torch_tensor -#> -1.5896 -1.4300 -0.5647 -#> -1.4339 -1.2857 0.3288 -#> -0.9875 0.1254 0.1272 -#> 0.6789 -0.6983 -0.5560 -#> [ CPUFloatType{4,3} ]
    torch_cross(a, b, dim=2)
    #> torch_tensor -#> 1.3479 -1.8068 0.7810 -#> 0.2135 0.5336 3.0171 -#> 0.0842 1.8351 -1.1546 -#> -0.7949 -0.1601 -0.7695 -#> [ CPUFloatType{4,3} ]
    torch_cross(a, b)
    #> torch_tensor -#> 1.3479 -1.8068 0.7810 -#> 0.2135 0.5336 3.0171 -#> 0.0842 1.8351 -1.1546 -#> -0.7949 -0.1601 -0.7695 -#> [ CPUFloatType{4,3} ]
    # } -
diff --git a/docs/reference/torch_cummax.html b/docs/reference/torch_cummax.html
deleted file mode 100644
index 890e8b7cbdd7418c5dc2eb8c1e4c936f79db6ce0..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cummax.html
+++ /dev/null
@@ -1,267 +0,0 @@
-Cummax — torch_cummax • torch

    Cummax

Arguments

input

(Tensor) the input tensor.

dim

(int) the dimension to do the operation over

out

(tuple, optional) the result tuple of two output tensors (values, indices)

cummax(input, dim, out=None) -> (Tensor, LongTensor)

Returns a namedtuple (values, indices) where values is the cumulative maximum of elements of input in the dimension dim, and indices is the index location of each maximum value found in the dimension dim.

$$
y_i = \max(x_1, x_2, x_3, \dots, x_i)
$$

Examples

# \dontrun{

a = torch_randn(c(10))
a
    #> torch_tensor -#> 0.4011 -#> 0.0282 -#> 1.4709 -#> -0.3322 -#> -1.1082 -#> -0.6219 -#> -1.1143 -#> 0.0945 -#> -0.5687 -#> -0.1941 -#> [ CPUFloatType{10} ]
    torch_cummax(a, dim=1)
    #> [[1]] -#> torch_tensor -#> 0.4011 -#> 0.4011 -#> 1.4709 -#> 1.4709 -#> 1.4709 -#> 1.4709 -#> 1.4709 -#> 1.4709 -#> 1.4709 -#> 1.4709 -#> [ CPUFloatType{10} ] -#> -#> [[2]] -#> torch_tensor -#> 0 -#> 0 -#> 2 -#> 2 -#> 2 -#> 2 -#> 2 -#> 2 -#> 2 -#> 2 -#> [ CPULongType{10} ] -#>
    # } -
diff --git a/docs/reference/torch_cummin.html b/docs/reference/torch_cummin.html
deleted file mode 100644
index 32e90d79c960cef87df4fb1789b0886375a22d6f..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cummin.html
+++ /dev/null
@@ -1,267 +0,0 @@
-Cummin — torch_cummin • torch

    Cummin

Arguments

input

(Tensor) the input tensor.

dim

(int) the dimension to do the operation over

out

(tuple, optional) the result tuple of two output tensors (values, indices)

cummin(input, dim, out=None) -> (Tensor, LongTensor)

Returns a namedtuple (values, indices) where values is the cumulative minimum of elements of input in the dimension dim, and indices is the index location of each minimum value found in the dimension dim.

$$
y_i = \min(x_1, x_2, x_3, \dots, x_i)
$$

Examples

# \dontrun{

a = torch_randn(c(10))
a
    #> torch_tensor -#> -1.1300 -#> -0.0916 -#> 1.2476 -#> 1.1859 -#> -0.8123 -#> -1.0110 -#> 0.5914 -#> 1.0707 -#> 1.3137 -#> -0.0139 -#> [ CPUFloatType{10} ]
    torch_cummin(a, dim=1)
    #> [[1]] -#> torch_tensor -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> -1.1300 -#> [ CPUFloatType{10} ] -#> -#> [[2]] -#> torch_tensor -#> 0 -#> 0 -#> 0 -#> 0 -#> 0 -#> 0 -#> 0 -#> 0 -#> 0 -#> 0 -#> [ CPULongType{10} ] -#>
    # } -
diff --git a/docs/reference/torch_cumprod.html b/docs/reference/torch_cumprod.html
deleted file mode 100644
index 12275140adcb3b01d031923db9f7cd9b6051dc88..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cumprod.html
+++ /dev/null
@@ -1,256 +0,0 @@
-Cumprod — torch_cumprod • torch

    Cumprod

Arguments

input

(Tensor) the input tensor.

dim

(int) the dimension to do the operation over

dtype

(torch.dtype, optional) the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. This is useful for preventing data type overflows. Default: None.

out

(Tensor, optional) the output tensor.

cumprod(input, dim, out=None, dtype=None) -> Tensor

Returns the cumulative product of elements of input in the dimension dim.

For example, if input is a vector of size N, the result will also be a vector of size N, with elements

$$
y_i = x_1 \times x_2 \times x_3 \times \dots \times x_i
$$

Examples

# \dontrun{

a = torch_randn(c(10))
a
    #> torch_tensor -#> -1.1828 -#> -0.2114 -#> 0.2918 -#> 0.5649 -#> -0.9559 -#> -0.9795 -#> -0.7152 -#> -0.2064 -#> 0.5134 -#> 0.5777 -#> [ CPUFloatType{10} ]
    torch_cumprod(a, dim=1)
    #> torch_tensor -#> -1.1828 -#> 0.2500 -#> 0.0729 -#> 0.0412 -#> -0.0394 -#> 0.0386 -#> -0.0276 -#> 0.0057 -#> 0.0029 -#> 0.0017 -#> [ CPUFloatType{10} ]
    # } -
diff --git a/docs/reference/torch_cumsum.html b/docs/reference/torch_cumsum.html
deleted file mode 100644
index 42d0f64bbc0cad9f1d8bd08c204be6aa7105ad9a..0000000000000000000000000000000000000000
--- a/docs/reference/torch_cumsum.html
+++ /dev/null
@@ -1,256 +0,0 @@
-Cumsum — torch_cumsum • torch

    Cumsum

Arguments

input

(Tensor) the input tensor.

dim

(int) the dimension to do the operation over

dtype

(torch.dtype, optional) the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. This is useful for preventing data type overflows. Default: None.

out

(Tensor, optional) the output tensor.

cumsum(input, dim, out=None, dtype=None) -> Tensor

Returns the cumulative sum of elements of input in the dimension dim.

For example, if input is a vector of size N, the result will also be a vector of size N, with elements

$$
y_i = x_1 + x_2 + x_3 + \dots + x_i
$$

Examples

# \dontrun{

a = torch_randn(c(10))
a
    #> torch_tensor -#> 0.2718 -#> 0.1169 -#> 0.3449 -#> -1.6346 -#> -0.0393 -#> -0.0197 -#> 0.0704 -#> 0.5245 -#> -1.4307 -#> -0.8103 -#> [ CPUFloatType{10} ]
    torch_cumsum(a, dim=1)
    #> torch_tensor -#> 0.2718 -#> 0.3887 -#> 0.7335 -#> -0.9010 -#> -0.9403 -#> -0.9600 -#> -0.8896 -#> -0.3651 -#> -1.7958 -#> -2.6060 -#> [ CPUFloatType{10} ]
    # } -
diff --git a/docs/reference/torch_det.html b/docs/reference/torch_det.html
deleted file mode 100644
index 7353562c1a0dafa952af9355cbd68448d0914b9c..0000000000000000000000000000000000000000
--- a/docs/reference/torch_det.html
+++ /dev/null
@@ -1,244 +0,0 @@
-Det — torch_det • torch

    Det

Arguments

input

(Tensor) the input tensor of size (*, n, n) where * is zero or more batch dimensions.

Note

Backward through det internally uses SVD results when input is not invertible. In this case, double backward through det will be unstable when input doesn't have distinct singular values. See torch_svd for details.

det(input) -> Tensor

Calculates the determinant of a square matrix or batches of square matrices.

Examples

# \dontrun{

A = torch_randn(c(3, 3))
torch_det(A)
    #> torch_tensor -#> -0.485994 -#> [ CPUFloatType{} ]
A = torch_randn(c(3, 2, 2))
A
    #> torch_tensor -#> (1,.,.) = -#> -0.0024 -1.0185 -#> 0.3236 -0.2787 -#> -#> (2,.,.) = -#> 0.4021 -0.3008 -#> -0.8884 0.5782 -#> -#> (3,.,.) = -#> -0.3821 1.3521 -#> -0.3608 0.3485 -#> [ CPUFloatType{3,2,2} ]
    A$det()
    #> torch_tensor -#> 0.3302 -#> -0.0347 -#> 0.3547 -#> [ CPUFloatType{3} ]
    # } -
diff --git a/docs/reference/torch_device.html b/docs/reference/torch_device.html
deleted file mode 100644
index c8fb5ab1f8479b3a49699402a7cbe8fb707b8abe..0000000000000000000000000000000000000000
--- a/docs/reference/torch_device.html
+++ /dev/null
@@ -1,225 +0,0 @@
-Create a Device object — torch_device • torch

A torch_device is an object representing the device on which a torch_tensor is or will be allocated.

torch_device(type, index = NULL)

Arguments

type

(character) a device type "cuda" or "cpu"

index

(integer) optional device ordinal for the device type. If the device ordinal is not present, this object will always represent the current device for the device type, even after torch_cuda_set_device() is called; e.g., a torch_tensor constructed with device 'cuda' is equivalent to 'cuda:X' where X is the result of torch_cuda_current_device().

A torch_device can be constructed via a string, or via a string and a device ordinal.

Examples

# \dontrun{

# Via string
torch_device("cuda:1")
    #> torch_device(type='cuda', index=1)
    torch_device("cpu")
    #> torch_device(type='cpu')
    torch_device("cuda") # current cuda device
    #> torch_device(type='cuda')
# Via string and device ordinal
torch_device("cuda", 0)
    #> torch_device(type='cuda', index=0)
    torch_device("cpu", 0)
    #> torch_device(type='cpu', index=0)
    -# } -
diff --git a/docs/reference/torch_diag.html b/docs/reference/torch_diag.html
deleted file mode 100644
index 0b1d849b1ca420af018016e715fa3e8230c89252..0000000000000000000000000000000000000000
--- a/docs/reference/torch_diag.html
+++ /dev/null
@@ -1,229 +0,0 @@
-Diag — torch_diag • torch

    Diag

Arguments

input

(Tensor) the input tensor.

diagonal

(int, optional) the diagonal to consider

out

(Tensor, optional) the output tensor.

diag(input, diagonal=0, out=None) -> Tensor

• If input is a vector (1-D tensor), then returns a 2-D square tensor with the elements of input as the diagonal.

• If input is a matrix (2-D tensor), then returns a 1-D tensor with the diagonal elements of input.

The argument diagonal controls which diagonal to consider:

• If diagonal = 0, it is the main diagonal.

• If diagonal > 0, it is above the main diagonal.

• If diagonal < 0, it is below the main diagonal.
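Examples

The generated page had no Examples section; the lines below are a minimal, untested sketch of both directions, following the conventions of the neighbouring pages.

# \dontrun{

a = torch_randn(c(3))
torch_diag(a)     # 3x3 matrix with a on the main diagonal
m = torch_randn(c(3, 3))
torch_diag(m)     # 1-D tensor holding the main diagonal of m
torch_diag(m, 1)  # the diagonal above the main one
# }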
diff --git a/docs/reference/torch_diag_embed.html b/docs/reference/torch_diag_embed.html
deleted file mode 100644
index 2b8049af1e030f672c1ec25f585e9b4eac4304ad..0000000000000000000000000000000000000000
--- a/docs/reference/torch_diag_embed.html
+++ /dev/null
@@ -1,272 +0,0 @@
-Diag_embed — torch_diag_embed • torch

    Diag_embed

Arguments

input

(Tensor) the input tensor. Must be at least 1-dimensional.

offset

(int, optional) which diagonal to consider. Default: 0 (main diagonal).

dim1

(int, optional) first dimension with respect to which to take diagonal. Default: -2.

dim2

(int, optional) second dimension with respect to which to take diagonal. Default: -1.

diag_embed(input, offset=0, dim1=-2, dim2=-1) -> Tensor

Creates a tensor whose diagonals of certain 2D planes (specified by dim1 and dim2) are filled by input. To facilitate creating batched diagonal matrices, the 2D planes formed by the last two dimensions of the returned tensor are chosen by default.

The argument offset controls which diagonal to consider:

• If offset = 0, it is the main diagonal.

• If offset > 0, it is above the main diagonal.

• If offset < 0, it is below the main diagonal.

The size of the new matrix will be calculated to make the specified diagonal of the size of the last input dimension. Note that for offset other than \(0\), the order of dim1 and dim2 matters. Exchanging them is equivalent to changing the sign of offset.

Applying torch_diagonal to the output of this function with the same arguments yields a matrix identical to input. However, torch_diagonal has different default dimensions, so those need to be explicitly specified.

Examples

# \dontrun{

a = torch_randn(c(2, 3))
torch_diag_embed(a)
    #> torch_tensor -#> (1,.,.) = -#> -2.3038 0.0000 0.0000 -#> 0.0000 2.0129 0.0000 -#> 0.0000 0.0000 -1.6884 -#> -#> (2,.,.) = -#> 0.8534 0.0000 0.0000 -#> 0.0000 -0.5520 0.0000 -#> 0.0000 0.0000 2.2299 -#> [ CPUFloatType{2,3,3} ]
    torch_diag_embed(a, offset=1, dim1=1, dim2=3)
    #> torch_tensor -#> (1,.,.) = -#> 0.0000 -2.3038 0.0000 0.0000 -#> 0.0000 0.8534 0.0000 0.0000 -#> -#> (2,.,.) = -#> 0.0000 0.0000 2.0129 0.0000 -#> 0.0000 0.0000 -0.5520 0.0000 -#> -#> (3,.,.) = -#> 0.0000 0.0000 0.0000 -1.6884 -#> 0.0000 0.0000 0.0000 2.2299 -#> -#> (4,.,.) = -#> 0 0 0 0 -#> 0 0 0 0 -#> [ CPUFloatType{4,2,4} ]
    # } -
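The round trip described above, as a short untested sketch (dim1 and dim2 are given 1-based, as in the torch_diagonal example on its own page):

# \dontrun{

a = torch_randn(c(2, 3))
d = torch_diag_embed(a)               # shape (2, 3, 3)
torch_diagonal(d, dim1 = 2, dim2 = 3) # recovers a, shape (2, 3)
# }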
diff --git a/docs/reference/torch_diagflat.html b/docs/reference/torch_diagflat.html
deleted file mode 100644
index de29c4bb07b6c448129f245d8fb211959a863a3e..0000000000000000000000000000000000000000
--- a/docs/reference/torch_diagflat.html
+++ /dev/null
@@ -1,253 +0,0 @@
-Diagflat — torch_diagflat • torch

    Diagflat

Arguments

input

(Tensor) the input tensor.

offset

(int, optional) the diagonal to consider. Default: 0 (main diagonal).

diagflat(input, offset=0) -> Tensor

• If input is a vector (1-D tensor), then returns a 2-D square tensor with the elements of input as the diagonal.

• If input is a tensor with more than one dimension, then returns a 2-D tensor with diagonal elements equal to a flattened input.

The argument offset controls which diagonal to consider:

• If offset = 0, it is the main diagonal.

• If offset > 0, it is above the main diagonal.

• If offset < 0, it is below the main diagonal.

Examples

# \dontrun{

a = torch_randn(c(3))
a
    #> torch_tensor -#> 1.2456 -#> -1.0479 -#> 0.2374 -#> [ CPUFloatType{3} ]
    torch_diagflat(a)
    #> torch_tensor -#> 1.2456 0.0000 0.0000 -#> 0.0000 -1.0479 0.0000 -#> 0.0000 0.0000 0.2374 -#> [ CPUFloatType{3,3} ]
    torch_diagflat(a, 1)
    #> torch_tensor -#> 0.0000 1.2456 0.0000 0.0000 -#> 0.0000 0.0000 -1.0479 0.0000 -#> 0.0000 0.0000 0.0000 0.2374 -#> 0.0000 0.0000 0.0000 0.0000 -#> [ CPUFloatType{4,4} ]
a = torch_randn(c(2, 2))
a
    #> torch_tensor -#> 0.5628 -0.2248 -#> 0.2077 -2.6745 -#> [ CPUFloatType{2,2} ]
    torch_diagflat(a)
    #> torch_tensor -#> 0.5628 0.0000 0.0000 0.0000 -#> 0.0000 -0.2248 0.0000 0.0000 -#> 0.0000 0.0000 0.2077 0.0000 -#> 0.0000 0.0000 0.0000 -2.6745 -#> [ CPUFloatType{4,4} ]
    # } -
diff --git a/docs/reference/torch_diagonal.html b/docs/reference/torch_diagonal.html
deleted file mode 100644
index 5a4719b5eed2807bd01ddcdc225a02208a700339..0000000000000000000000000000000000000000
--- a/docs/reference/torch_diagonal.html
+++ /dev/null
@@ -1,268 +0,0 @@
-Diagonal — torch_diagonal • torch

    Diagonal

Arguments

input

(Tensor) the input tensor. Must be at least 2-dimensional.

offset

(int, optional) which diagonal to consider. Default: 0 (main diagonal).

dim1

(int, optional) first dimension with respect to which to take diagonal. Default: 0.

dim2

(int, optional) second dimension with respect to which to take diagonal. Default: 1.

diagonal(input, offset=0, dim1=0, dim2=1) -> Tensor

Returns a partial view of input with its diagonal elements with respect to dim1 and dim2 appended as a dimension at the end of the shape.

The argument offset controls which diagonal to consider:

• If offset = 0, it is the main diagonal.

• If offset > 0, it is above the main diagonal.

• If offset < 0, it is below the main diagonal.

Applying torch_diag_embed to the output of this function with the same arguments yields a diagonal matrix with the diagonal entries of the input. However, torch_diag_embed has different default dimensions, so those need to be explicitly specified.

Examples

# \dontrun{

a = torch_randn(c(3, 3))
a
    #> torch_tensor -#> -0.4461 1.6781 0.7912 -#> -2.0005 -0.9287 -0.1604 -#> 0.6617 -0.6939 1.5567 -#> [ CPUFloatType{3,3} ]
    torch_diagonal(a, offset = 0)
    #> torch_tensor -#> -0.4461 -#> -0.9287 -#> 1.5567 -#> [ CPUFloatType{3} ]
    torch_diagonal(a, offset = 1)
    #> torch_tensor -#> 1.6781 -#> -0.1604 -#> [ CPUFloatType{2} ]
x = torch_randn(c(2, 5, 4, 2))
torch_diagonal(x, offset=-1, dim1=1, dim2=2)
    #> torch_tensor -#> (1,.,.) = -#> 0.3670 -#> -1.7063 -#> -#> (2,.,.) = -#> 0.6795 -#> 1.1359 -#> -#> (3,.,.) = -#> -0.5056 -#> -0.1993 -#> -#> (4,.,.) = -#> 0.5929 -#> -0.6632 -#> [ CPUFloatType{4,2,1} ]
    # } -
diff --git a/docs/reference/torch_digamma.html b/docs/reference/torch_digamma.html
deleted file mode 100644
index 788c22aa1471de85ba0e041aaa90bc9c866b9a20..0000000000000000000000000000000000000000
--- a/docs/reference/torch_digamma.html
+++ /dev/null
@@ -1,222 +0,0 @@
-Digamma — torch_digamma • torch

    Digamma

Arguments

input

(Tensor) the tensor to compute the digamma function on

digamma(input, out=None) -> Tensor

Computes the logarithmic derivative of the gamma function on input.

$$
\psi(x) = \frac{d}{dx} \ln\left(\Gamma\left(x\right)\right) = \frac{\Gamma'(x)}{\Gamma(x)}
$$

Examples

# \dontrun{

a = torch_tensor(c(1, 0.5))
torch_digamma(a)
    #> torch_tensor -#> -0.5772 -#> -1.9635 -#> [ CPUFloatType{2} ]
    # } -
diff --git a/docs/reference/torch_dist.html b/docs/reference/torch_dist.html
deleted file mode 100644
index 0a515c428ecfed7dec343e1f09f42e63066fdca5..0000000000000000000000000000000000000000
--- a/docs/reference/torch_dist.html
+++ /dev/null
@@ -1,245 +0,0 @@
-Dist — torch_dist • torch

    Dist

Arguments

input

(Tensor) the input tensor.

other

(Tensor) the right-hand-side input tensor

p

(float, optional) the norm to be computed

dist(input, other, p=2) -> Tensor

Returns the p-norm of (input - other).

The shapes of input and other must be broadcastable.

Examples

# \dontrun{

x = torch_randn(c(4))
x
    #> torch_tensor -#> 0.3528 -#> -0.2518 -#> -0.8406 -#> -0.3756 -#> [ CPUFloatType{4} ]
y = torch_randn(c(4))
y
    #> torch_tensor -#> 0.5376 -#> -1.6162 -#> -0.8764 -#> 0.2081 -#> [ CPUFloatType{4} ]
    torch_dist(x, y, 3.5)
    #> torch_tensor -#> 1.38438 -#> [ CPUFloatType{} ]
    torch_dist(x, y, 3)
    #> torch_tensor -#> 1.40023 -#> [ CPUFloatType{} ]
    torch_dist(x, y, 0)
    #> torch_tensor -#> 4 -#> [ CPUFloatType{} ]
    torch_dist(x, y, 1)
    #> torch_tensor -#> 2.16879 -#> [ CPUFloatType{} ]
    # } -
diff --git a/docs/reference/torch_div.html b/docs/reference/torch_div.html
deleted file mode 100644
index b5ef3803645d8b5713d41320f3333925b7f62750..0000000000000000000000000000000000000000
--- a/docs/reference/torch_div.html
+++ /dev/null
@@ -1,283 +0,0 @@
-Div — torch_div • torch

    Div

Arguments

input

(Tensor) the input tensor.

other

(Number) the number by which each element of input is divided

div(input, other, out=None) -> Tensor

Divides each element of input by the scalar other and returns a new resulting tensor.

Each element of the tensor input is divided by each element of the tensor other. The resulting tensor is returned.

$$
\mbox{out}_i = \frac{\mbox{input}_i}{\mbox{other}_i}
$$

The shapes of input and other must be broadcastable. If the torch_dtype of input and other differ, the torch_dtype of the result tensor is determined following rules described in the type promotion documentation. If out is specified, the result must be castable to the torch_dtype of the specified output tensor. Integral division by zero leads to undefined behavior.

Warning

Integer division using div is deprecated, and in a future release div will perform true division like torch_true_divide. Use torch_floor_divide (// in Python) to perform integer division, instead.

$$
\mbox{out}_i = \frac{\mbox{input}_i}{\mbox{other}}
$$

If the torch_dtype of input and other differ, the torch_dtype of the result tensor is determined following rules described in the type promotion documentation. If out is specified, the result must be castable to the torch_dtype of the specified output tensor. Integral division by zero leads to undefined behavior.
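A short, untested sketch of the distinction drawn in the warning above, using the two replacement functions it names:

# \dontrun{

x = torch_tensor(c(7, 9))
torch_floor_divide(x, 2)  # integer-style division: 3, 4
torch_true_divide(x, 2)   # true division: 3.5, 4.5
# }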


    Examples

# \dontrun{

a = torch_randn(c(5))
a
    #> torch_tensor -#> -1.2469 -#> -0.8301 -#> 0.6777 -#> 0.4991 -#> 2.3110 -#> [ CPUFloatType{5} ]
    torch_div(a, 0.5)
    #> torch_tensor -#> -2.4938 -#> -1.6601 -#> 1.3554 -#> 0.9983 -#> 4.6219 -#> [ CPUFloatType{5} ]
a = torch_randn(c(4, 4))
a
    #> torch_tensor -#> -0.1627 0.0346 0.2195 0.4264 -#> -1.3478 -2.4998 -1.5522 0.7143 -#> 0.2875 -1.3762 0.8071 -0.8691 -#> 0.7893 -1.7013 -0.2038 -0.3908 -#> [ CPUFloatType{4,4} ]
b = torch_randn(c(4))
b
    #> torch_tensor -#> -0.2924 -#> 1.8341 -#> 0.0283 -#> -0.6063 -#> [ CPUFloatType{4} ]
    torch_div(a, b)
    #> torch_tensor -#> 0.5566 0.0188 7.7430 -0.7033 -#> 4.6100 -1.3630 -54.7623 -1.1781 -#> -0.9833 -0.7503 28.4751 1.4334 -#> -2.6995 -0.9276 -7.1916 0.6445 -#> [ CPUFloatType{4,4} ]
    # } -
diff --git a/docs/reference/torch_dot.html b/docs/reference/torch_dot.html
deleted file mode 100644
index aa3c0a296d402476edb01f36c62c1464f71596cc..0000000000000000000000000000000000000000
--- a/docs/reference/torch_dot.html
+++ /dev/null
@@ -1,212 +0,0 @@
-Dot — torch_dot • torch

    Dot

Note

This function does not broadcast.

dot(input, tensor) -> Tensor

Computes the dot product (inner product) of two tensors.

Examples

# \dontrun{

torch_dot(torch_tensor(c(2, 3)), torch_tensor(c(2, 1)))
    #> torch_tensor -#> 7 -#> [ CPUFloatType{} ]
    # } -
diff --git a/docs/reference/torch_dtype.html b/docs/reference/torch_dtype.html
deleted file mode 100644
index 2c8e5a6c5831632bd51eb4672e5d35f1d84be12d..0000000000000000000000000000000000000000
--- a/docs/reference/torch_dtype.html
+++ /dev/null
@@ -1,231 +0,0 @@
-Torch data types — torch_dtype • torch

Returns the corresponding data type.

torch_float32()
torch_float()
torch_float64()
torch_double()
torch_float16()
torch_half()
torch_uint8()
torch_int8()
torch_int16()
torch_short()
torch_int32()
torch_int()
torch_int64()
torch_long()
torch_bool()
torch_quint8()
torch_qint8()
torch_qint32()
diff --git a/docs/reference/torch_eig.html b/docs/reference/torch_eig.html
deleted file mode 100644
index f59c5007519ab5c1c4c4a1d42efb425f4f0905b7..0000000000000000000000000000000000000000
--- a/docs/reference/torch_eig.html
+++ /dev/null
@@ -1,225 +0,0 @@
-Eig — torch_eig • torch

    Eig

Arguments

input

(Tensor) the square matrix of shape \((n \times n)\) for which the eigenvalues and eigenvectors will be computed

eigenvectors

(bool) True to compute both eigenvalues and eigenvectors; otherwise, only eigenvalues will be computed

out

(tuple, optional) the output tensors

Note

Since eigenvalues and eigenvectors might be complex, the backward pass is supported only for torch_symeig.

eig(input, eigenvectors=False, out=None) -> (Tensor, Tensor)

Computes the eigenvalues and eigenvectors of a real square matrix.
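Examples

The generated page had no Examples section; the lines below are a minimal, untested sketch in the style of the other pages (the eigenvalues come back as an (n, 2) tensor of real and imaginary parts):

# \dontrun{

a = torch_randn(c(2, 2))
torch_eig(a)                       # eigenvalues only
torch_eig(a, eigenvectors = TRUE)  # eigenvalues and eigenvectors
# }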

diff --git a/docs/reference/torch_einsum.html b/docs/reference/torch_einsum.html
deleted file mode 100644
index e94b7849350d4f0c25810bec9fdf1839e43886f8..0000000000000000000000000000000000000000
--- a/docs/reference/torch_einsum.html
+++ /dev/null
@@ -1,260 +0,0 @@
-Einsum — torch_einsum • torch

    Einsum

Arguments

equation

(string) The equation is given in terms of lower case letters (indices) to be associated with each dimension of the operands and result. The left hand side lists the operands' dimensions, separated by commas. There should be one index letter per tensor dimension. The right hand side follows after -> and gives the indices for the output. If the -> and right hand side are omitted, it is implicitly defined as the alphabetically sorted list of all indices appearing exactly once in the left hand side. The indices not appearing in the output are summed over after multiplying the operands' entries. If an index appears several times for the same operand, a diagonal is taken. Ellipses ... represent a fixed number of dimensions. If the right hand side is inferred, the ellipsis dimensions are at the beginning of the output.

operands

(Tensor) The operands to compute the Einstein sum of.

einsum(equation, *operands) -> Tensor

This function provides a way of computing multilinear expressions (i.e. sums of products) using the Einstein summation convention.

Examples

# \dontrun{

x = torch_randn(c(5))
y = torch_randn(c(4))
torch_einsum('i,j->ij', list(x, y)) # outer product
    #> torch_tensor -#> 0.6481 -0.3038 0.7547 0.3279 -#> -0.7964 0.3732 -0.9273 -0.4029 -#> 0.2729 -0.1279 0.3178 0.1381 -#> -0.4882 0.2288 -0.5684 -0.2470 -#> 2.6219 -1.2288 3.0530 1.3266 -#> [ CPUFloatType{5,4} ]
A = torch_randn(c(3,5,4))
l = torch_randn(c(2,5))
r = torch_randn(c(2,4))
torch_einsum('bn,anm,bm->ba', list(l, A, r)) # compare torch_nn$functional$bilinear
    #> torch_tensor -#> 4.7791 -6.3600 2.0599 -#> 1.2294 0.6515 -3.1033 -#> [ CPUFloatType{2,3} ]
As = torch_randn(c(3,2,5))
Bs = torch_randn(c(3,5,4))
torch_einsum('bij,bjk->bik', list(As, Bs)) # batch matrix multiplication
    #> torch_tensor -#> (1,.,.) = -#> 0.7724 0.1715 0.3778 0.5154 -#> -0.7152 3.1174 0.5206 -3.0151 -#> -#> (2,.,.) = -#> -2.1745 4.5514 0.2279 2.8187 -#> -0.4776 1.4331 -1.1076 0.3273 -#> -#> (3,.,.) = -#> 1.0775 0.4867 -2.7086 -0.8871 -#> -1.4834 -2.0710 -1.2663 0.2878 -#> [ CPUFloatType{3,2,4} ]
A = torch_randn(c(3, 3))
torch_einsum('ii->i', list(A)) # diagonal
    #> torch_tensor -#> 0.7209 -#> 0.7370 -#> 0.0642 -#> [ CPUFloatType{3} ]
A = torch_randn(c(4, 3, 3))
torch_einsum('...ii->...i', list(A)) # batch diagonal
    #> torch_tensor -#> -1.1323 2.2000 -0.2178 -#> 1.1550 -0.8415 -1.0067 -#> 2.1543 -0.0441 -1.3181 -#> -0.0296 -2.3285 0.7952 -#> [ CPUFloatType{4,3} ]
A = torch_randn(c(2, 3, 4, 5))
torch_einsum('...ij->...ji', list(A))$shape # batch permute
    #> [1] 2 3 5 4
    # } -
diff --git a/docs/reference/torch_empty.html b/docs/reference/torch_empty.html
deleted file mode 100644
index dfb34f407e4b972228751fd1e105698b40d86b45..0000000000000000000000000000000000000000
--- a/docs/reference/torch_empty.html
+++ /dev/null
@@ -1,247 +0,0 @@
-Empty — torch_empty • torch

    Empty

Arguments

size

(int...) a sequence of integers defining the shape of the output tensor. Can be a variable number of arguments or a collection like a list or tuple.

out

(Tensor, optional) the output tensor.

dtype

(torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

layout

(torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

device

(torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

requires_grad

(bool, optional) If autograd should record operations on the returned tensor. Default: False.

pin_memory

(bool, optional) If set, returned tensor would be allocated in the pinned memory. Works only for CPU tensors. Default: False.

memory_format

(torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_contiguous_format.

empty(*size, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False, pin_memory=False) -> Tensor

Returns a tensor filled with uninitialized data. The shape of the tensor is defined by the variable argument size.

Examples

# \dontrun{

torch_empty(c(2, 3))
    #> torch_tensor -#> 0.0000e+00 1.0842e-19 -2.0454e-24 -#> 8.5920e+09 8.4078e-45 4.9045e-44 -#> [ CPUFloatType{2,3} ]
    # } -
diff --git a/docs/reference/torch_empty_like.html b/docs/reference/torch_empty_like.html
deleted file mode 100644
index c501031f101914dcf07a28c2db38d20a5929f432..0000000000000000000000000000000000000000
--- a/docs/reference/torch_empty_like.html
+++ /dev/null
@@ -1,240 +0,0 @@
-Empty_like — torch_empty_like • torch

    Empty_like

Arguments

input

(Tensor) the size of input will determine size of the output tensor.

dtype

(torch.dtype, optional) the desired data type of returned Tensor. Default: if None, defaults to the dtype of input.

layout

(torch.layout, optional) the desired layout of returned tensor. Default: if None, defaults to the layout of input.

device

(torch.device, optional) the desired device of returned tensor. Default: if None, defaults to the device of input.

requires_grad

(bool, optional) If autograd should record operations on the returned tensor. Default: False.

memory_format

(torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_preserve_format.

empty_like(input, dtype=None, layout=None, device=None, requires_grad=False, memory_format=torch.preserve_format) -> Tensor

Returns an uninitialized tensor with the same size as input. torch_empty_like(input) is equivalent to torch_empty(input.size(), dtype=input.dtype, layout=input.layout, device=input.device).

Examples

# \dontrun{

x = torch_empty(list(2, 3), dtype = torch_int64())
torch_empty_like(x)
    #> torch_tensor -#> 1.4057e+14 1.4057e+14 1.4057e+14 -#> 0.0000e+00 8.5899e+09 1.4057e+14 -#> [ CPULongType{2,3} ]
    # } -

diff --git a/docs/reference/torch_empty_strided.html b/docs/reference/torch_empty_strided.html
deleted file mode 100644

Empty_strided — torch_empty_strided • torch

Arguments

size
  (tuple of ints) the shape of the output tensor

stride
  (tuple of ints) the strides of the output tensor

dtype
  (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

layout
  (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

device
  (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

requires_grad
  (bool, optional) If autograd should record operations on the returned tensor. Default: False.

pin_memory
  (bool, optional) If set, returned tensor would be allocated in the pinned memory. Works only for CPU tensors. Default: False.

empty_strided(size, stride, dtype=None, layout=None, device=None, requires_grad=False, pin_memory=False) -> Tensor

Returns a tensor filled with uninitialized data. The shape and strides of the tensor are defined by the variable arguments size and stride respectively. torch_empty_strided(size, stride) is equivalent to torch_empty(size).as_strided(size, stride).

Warning

More than one element of the created tensor may refer to a single memory location. As a result, in-place operations (especially ones that are vectorized) may result in incorrect behavior. If you need to write to the tensors, please clone them first.

Examples

# \dontrun{
a = torch_empty_strided(list(2, 3), list(1, 2))
a
#> torch_tensor
#>  0.0000e+00 -4.2887e-24  2.2101e-10
#>  1.0842e-19  2.0005e+00  4.5592e+30
#> [ CPUFloatType{2,3} ]
a$stride(1)
#> [1] 1
a$size(1)
#> [1] 2
# }

diff --git a/docs/reference/torch_eq.html b/docs/reference/torch_eq.html
deleted file mode 100644

Eq — torch_eq • torch

Arguments

input
  (Tensor) the tensor to compare

other
  (Tensor or float) the tensor or value to compare

out
  (Tensor, optional) the output tensor. Must be a ByteTensor

eq(input, other, out=None) -> Tensor

Computes element-wise equality.

The second argument can be a number or a tensor whose shape is broadcastable with the first argument.

Examples

# \dontrun{
torch_eq(torch_tensor(c(1, 2, 3, 4)), torch_tensor(c(1, 3, 2, 4)))
#> torch_tensor
#>  1
#>  0
#>  0
#>  1
#> [ CPUBoolType{4} ]
# }
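
Because other may be a plain number, the same call also compares every element of a tensor against one scalar via broadcasting; a minimal sketch (the comparison value 2 is arbitrary):

# \dontrun{
torch_eq(torch_tensor(c(1, 2, 3, 4)), 2)
#> torch_tensor
#>  0
#>  1
#>  0
#>  0
#> [ CPUBoolType{4} ]
# }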

diff --git a/docs/reference/torch_equal.html b/docs/reference/torch_equal.html
deleted file mode 100644

Equal — torch_equal • torch

equal(input, other) -> bool

True if two tensors have the same size and elements, False otherwise.

Examples

# \dontrun{
torch_equal(torch_tensor(c(1, 2)), torch_tensor(c(1, 2)))
#> [1] TRUE
# }
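
To see how this differs from torch_eq: torch_eq reports a per-element boolean tensor, while torch_equal reduces the whole comparison to a single logical; a small sketch:

# \dontrun{
torch_eq(torch_tensor(c(1, 2)), torch_tensor(c(1, 3)))    # element-wise: one match, one mismatch
torch_equal(torch_tensor(c(1, 2)), torch_tensor(c(1, 3))) # overall: FALSE
# }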

diff --git a/docs/reference/torch_erf.html b/docs/reference/torch_erf.html
deleted file mode 100644

Erf — torch_erf • torch

Arguments

input
  (Tensor) the input tensor.

out
  (Tensor, optional) the output tensor.

erf(input, out=None) -> Tensor

Computes the error function of each element. The error function is defined as follows:

$$
\mathrm{erf}(x) = \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-t^2} dt
$$

Examples

# \dontrun{
torch_erf(torch_tensor(c(0, -1., 10.)))
#> torch_tensor
#>  0.0000
#> -0.8427
#>  1.0000
#> [ CPUFloatType{3} ]
# }

diff --git a/docs/reference/torch_erfc.html b/docs/reference/torch_erfc.html
deleted file mode 100644

Erfc — torch_erfc • torch

Arguments

input
  (Tensor) the input tensor.

out
  (Tensor, optional) the output tensor.

erfc(input, out=None) -> Tensor

Computes the complementary error function of each element of input. The complementary error function is defined as follows:

$$
\mathrm{erfc}(x) = 1 - \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-t^2} dt
$$

Examples

# \dontrun{
torch_erfc(torch_tensor(c(0, -1., 10.)))
#> torch_tensor
#>  1.0000e+00
#>  1.8427e+00
#>  2.8026e-45
#> [ CPUFloatType{3} ]
# }

diff --git a/docs/reference/torch_erfinv.html b/docs/reference/torch_erfinv.html
deleted file mode 100644

Erfinv — torch_erfinv • torch

Arguments

input
  (Tensor) the input tensor.

out
  (Tensor, optional) the output tensor.

erfinv(input, out=None) -> Tensor

Computes the inverse error function of each element of input. The inverse error function is defined in the range \((-1, 1)\) as:

$$
\mathrm{erfinv}(\mathrm{erf}(x)) = x
$$

Examples

# \dontrun{
torch_erfinv(torch_tensor(c(0, 0.5, -1.)))
#> torch_tensor
#>  0.0000
#>  0.4769
#>    -inf
#> [ CPUFloatType{3} ]
# }
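
The defining identity above can be checked numerically by composing the two functions; a minimal sketch (the probe value 0.8 is arbitrary):

# \dontrun{
x <- torch_tensor(0.8)
torch_erfinv(torch_erf(x))  # recovers x up to floating point error
# }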

diff --git a/docs/reference/torch_exp.html b/docs/reference/torch_exp.html
deleted file mode 100644

Exp — torch_exp • torch

Arguments

input
  (Tensor) the input tensor.

out
  (Tensor, optional) the output tensor.

exp(input, out=None) -> Tensor

Returns a new tensor with the exponential of the elements of the input tensor input.

$$
y_{i} = e^{x_{i}}
$$

Examples

# \dontrun{
torch_exp(torch_tensor(c(0, log(2))))
#> torch_tensor
#>  1
#>  2
#> [ CPUFloatType{2} ]
# }

diff --git a/docs/reference/torch_expm1.html b/docs/reference/torch_expm1.html
deleted file mode 100644

Expm1 — torch_expm1 • torch

Arguments

input
  (Tensor) the input tensor.

out
  (Tensor, optional) the output tensor.

expm1(input, out=None) -> Tensor

Returns a new tensor with the exponential of the elements minus 1 of input.

$$
y_{i} = e^{x_{i}} - 1
$$

Examples

# \dontrun{
torch_expm1(torch_tensor(c(0, log(2))))
#> torch_tensor
#>  0
#>  1
#> [ CPUFloatType{2} ]
# }
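
The reason for a dedicated expm1 is numerical accuracy for small inputs, where evaluating exp(x) - 1 directly cancels away the significant digits; a minimal sketch (1e-10 is an arbitrary small probe):

# \dontrun{
x <- torch_tensor(1e-10)
torch_exp(x) - 1   # catastrophic cancellation in single precision: exactly 0
torch_expm1(x)     # keeps the leading-order result, approximately 1e-10
# }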

diff --git a/docs/reference/torch_eye.html b/docs/reference/torch_eye.html
deleted file mode 100644

Eye — torch_eye • torch

Arguments

n
  (int) the number of rows

m
  (int, optional) the number of columns with default being n

out
  (Tensor, optional) the output tensor.

dtype
  (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

layout
  (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

device
  (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

requires_grad
  (bool, optional) If autograd should record operations on the returned tensor. Default: False.

eye(n, m=None, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Returns a 2-D tensor with ones on the diagonal and zeros elsewhere.

Examples

# \dontrun{
torch_eye(3)
#> torch_tensor
#>  1  0  0
#>  0  1  0
#>  0  0  1
#> [ CPUFloatType{3,3} ]
# }
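
Passing m as well gives a rectangular matrix with ones on the main diagonal; a minimal sketch, assuming the second positional argument maps to m as in the signature above:

# \dontrun{
torch_eye(3, 4)
#> torch_tensor
#>  1  0  0  0
#>  0  1  0  0
#>  0  0  1  0
#> [ CPUFloatType{3,4} ]
# }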

diff --git a/docs/reference/torch_fft.html b/docs/reference/torch_fft.html
deleted file mode 100644

Fft — torch_fft • torch

Arguments

input
  (Tensor) the input tensor of at least signal_ndim + 1 dimensions

signal_ndim
  (int) the number of dimensions in each signal. signal_ndim can only be 1, 2 or 3

normalized
  (bool, optional) controls whether to return normalized results. Default: False

Note

For CUDA tensors, an LRU cache is used for cuFFT plans to speed up repeatedly running FFT methods on tensors of same geometry with same configuration. See cufft-plan-cache for more details on how to monitor and control the cache.

fft(input, signal_ndim, normalized=False) -> Tensor

Complex-to-complex Discrete Fourier Transform

This method computes the complex-to-complex discrete Fourier transform. Ignoring the batch dimensions, it computes the following expression:

$$
X[\omega_1, \dots, \omega_d] =
  \sum_{n_1=0}^{N_1-1} \dots \sum_{n_d=0}^{N_d-1} x[n_1, \dots, n_d]
  e^{-j\ 2 \pi \sum_{i=0}^d \frac{\omega_i n_i}{N_i}},
$$

where \(d\) = signal_ndim is the number of dimensions for the signal, and \(N_i\) is the size of signal dimension \(i\).

This method supports 1D, 2D and 3D complex-to-complex transforms, indicated by signal_ndim. input must be a tensor with last dimension of size 2, representing the real and imaginary components of complex numbers, and should have at least signal_ndim + 1 dimensions with an optionally arbitrary number of leading batch dimensions. If normalized is set to True, this normalizes the result by dividing it with \(\sqrt{\prod_{i=1}^K N_i}\) so that the operator is unitary.

Returns the real and the imaginary parts together as one tensor of the same shape as input.

The inverse of this function is torch_ifft.

Warning

For CPU tensors, this method is currently only available with MKL. Use torch_backends.mkl.is_available to check if MKL is installed.

Examples

# \dontrun{
# unbatched 2D FFT
x = torch_randn(c(4, 3, 2))
torch_fft(x, 2)
#> torch_tensor
#> (1,.,.) =
#>   1.6091  1.1177
#>   1.7387  0.5245
#>   1.3022 -7.5020
#>
#> (2,.,.) =
#>  -2.2896  1.5947
#>   2.3894 -3.7256
#>   2.4421 -6.0534
#>
#> (3,.,.) =
#>  -2.6684  0.2216
#>   4.6351  2.1387
#>   1.0104  2.1655
#>
#> (4,.,.) =
#>   0.6484  1.5756
#>   3.6644  0.9869
#>   0.8331  1.2714
#> [ CPUFloatType{4,3,2} ]
# batched 1D FFT
torch_fft(x, 1)
#> torch_tensor
#> (1,.,.) =
#>  -0.6751  1.1274
#>   3.1069 -0.0189
#>   1.3969 -2.5296
#>
#> (2,.,.) =
#>   1.0646 -0.5105
#>   0.4540 -0.7223
#>   1.9041 -2.0146
#>
#> (3,.,.) =
#>   0.1455 -0.4577
#>   0.0800  1.3505
#>  -0.2407 -0.1386
#>
#> (4,.,.) =
#>   1.0741  0.9585
#>  -1.9022 -0.0848
#>  -1.7582 -2.8191
#> [ CPUFloatType{4,3,2} ]
# arbitrary number of batch dimensions, 2D FFT
x = torch_randn(c(3, 3, 5, 5, 2))
torch_fft(x, 2)
#> torch_tensor
#> (1,1,1,.,.) =
#>  -5.4725 -1.0036
#>  -0.4072 -9.9106
#>  -6.2423 -6.4070
#>  -1.9919 -0.2080
#>  -1.5095 -4.7941
#>
#> (2,1,1,.,.) =
#>  -1.3392  3.2072
#>  -2.8398  7.8139
#>   0.8047 -3.0656
#>  10.5509 -0.0158
#>   0.4441  1.4712
#>
#> ...
#> [ CPUFloatType{3,3,5,5,2} ]
# }
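
Because normalized = TRUE divides by \(\sqrt{\prod_{i=1}^d N_i}\) to make the transform unitary, the inverse must be called with the same flag to get the signal back, as noted above; a minimal sketch:

# \dontrun{
x = torch_randn(c(4, 3, 2))
y = torch_fft(x, 2, normalized = TRUE)
torch_ifft(y, 2, normalized = TRUE)  # recovers x up to floating point error
# }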

diff --git a/docs/reference/torch_flatten.html b/docs/reference/torch_flatten.html
deleted file mode 100644

Flatten — torch_flatten • torch

Arguments

input
  (Tensor) the input tensor.

start_dim
  (int) the first dim to flatten

end_dim
  (int) the last dim to flatten

flatten(input, start_dim=0, end_dim=-1) -> Tensor

Flattens a contiguous range of dims in a tensor.

Examples

# \dontrun{
t = torch_tensor(matrix(c(1, 2), ncol = 2))
torch_flatten(t)
#> torch_tensor
#>  1
#>  2
#> [ CPUFloatType{2} ]
torch_flatten(t, start_dim = 2)
#> torch_tensor
#>  1  2
#> [ CPUFloatType{1,2} ]
# }

diff --git a/docs/reference/torch_flip.html b/docs/reference/torch_flip.html
deleted file mode 100644

Flip — torch_flip • torch

Arguments

input
  (Tensor) the input tensor.

dims
  (a list or tuple) axis to flip on

flip(input, dims) -> Tensor

Reverses the order of an n-D tensor along the given axes in dims.

Examples

# \dontrun{
x = torch_arange(0, 8)$view(c(2, 2, 2))
x
#> torch_tensor
#> (1,.,.) =
#>   0  1
#>   2  3
#>
#> (2,.,.) =
#>   4  5
#>   6  7
#> [ CPUFloatType{2,2,2} ]
torch_flip(x, c(1, 2))
#> torch_tensor
#> (1,.,.) =
#>   6  7
#>   4  5
#>
#> (2,.,.) =
#>   2  3
#>   0  1
#> [ CPUFloatType{2,2,2} ]
# }

diff --git a/docs/reference/torch_floor.html b/docs/reference/torch_floor.html
deleted file mode 100644

Floor — torch_floor • torch

Arguments

input
  (Tensor) the input tensor.

out
  (Tensor, optional) the output tensor.

floor(input, out=None) -> Tensor

Returns a new tensor with the floor of the elements of input, the largest integer less than or equal to each element.

$$
\mbox{out}_{i} = \left\lfloor \mbox{input}_{i} \right\rfloor
$$

Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#>  0.4440
#>  0.1195
#>  2.0850
#>  0.6923
#> [ CPUFloatType{4} ]
torch_floor(a)
#> torch_tensor
#>  0
#>  0
#>  2
#>  0
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_floor_divide.html b/docs/reference/torch_floor_divide.html
deleted file mode 100644

Floor_divide — torch_floor_divide • torch

Arguments

input
  (Tensor) the numerator tensor

other
  (Tensor or Scalar) the denominator

floor_divide(input, other, out=None) -> Tensor

Return the division of the inputs rounded down to the nearest integer. See torch_div for type promotion and broadcasting rules.

$$
\mbox{out}_i = \left\lfloor \frac{\mbox{input}_i}{\mbox{other}_i} \right\rfloor
$$

Examples

# \dontrun{
a = torch_tensor(c(4.0, 3.0))
b = torch_tensor(c(2.0, 2.0))
torch_floor_divide(a, b)
#> torch_tensor
#>  2
#>  1
#> [ CPUFloatType{2} ]
torch_floor_divide(a, 1.4)
#> torch_tensor
#>  2
#>  2
#> [ CPUFloatType{2} ]
# }

diff --git a/docs/reference/torch_fmod.html b/docs/reference/torch_fmod.html
deleted file mode 100644

Fmod — torch_fmod • torch

Arguments

input
  (Tensor) the dividend

other
  (Tensor or float) the divisor, which may be either a number or a tensor of the same shape as the dividend

out
  (Tensor, optional) the output tensor.

fmod(input, other, out=None) -> Tensor

Computes the element-wise remainder of division.

The dividend and divisor may contain both integer and floating point numbers. The remainder has the same sign as the dividend input.

When other is a tensor, the shapes of input and other must be broadcastable.

Examples

# \dontrun{
torch_fmod(torch_tensor(c(-3., -2, -1, 1, 2, 3)), 2)
#> torch_tensor
#> -1
#> -0
#> -1
#>  1
#>  0
#>  1
#> [ CPUFloatType{6} ]
torch_fmod(torch_tensor(c(1., 2, 3, 4, 5)), 1.5)
#> torch_tensor
#>  1.0000
#>  0.5000
#>  0.0000
#>  1.0000
#>  0.5000
#> [ CPUFloatType{5} ]
# }

diff --git a/docs/reference/torch_frac.html b/docs/reference/torch_frac.html
deleted file mode 100644

Frac — torch_frac • torch

frac(input, out=None) -> Tensor

Computes the fractional portion of each element in input.

$$
\mbox{out}_{i} = \mbox{input}_{i} - \left\lfloor |\mbox{input}_{i}| \right\rfloor * \mbox{sgn}(\mbox{input}_{i})
$$

Examples

# \dontrun{
torch_frac(torch_tensor(c(1, 2.5, -3.2)))
#> torch_tensor
#>  0.0000
#>  0.5000
#> -0.2000
#> [ CPUFloatType{3} ]
# }

diff --git a/docs/reference/torch_full.html b/docs/reference/torch_full.html
deleted file mode 100644

Full — torch_full • torch

Arguments

size
  (int...) a list, tuple, or torch_Size of integers defining the shape of the output tensor.

fill_value
  the number to fill the output tensor with.

out
  (Tensor, optional) the output tensor.

dtype
  (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

layout
  (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

device
  (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

requires_grad
  (bool, optional) If autograd should record operations on the returned tensor. Default: False.

full(size, fill_value, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Returns a tensor of size size filled with fill_value.

Warning

In PyTorch 1.5 a bool or integral fill_value will produce a warning if dtype or out are not set. In a future PyTorch release, when dtype and out are not set a bool fill_value will return a tensor of torch.bool dtype, and an integral fill_value will return a tensor of torch.long dtype.

Examples

# \dontrun{
torch_full(list(2, 3), 3.141592)
#> torch_tensor
#>  3.1416  3.1416  3.1416
#>  3.1416  3.1416  3.1416
#> [ CPUFloatType{2,3} ]
# }

diff --git a/docs/reference/torch_full_like.html b/docs/reference/torch_full_like.html
deleted file mode 100644

Full_like — torch_full_like • torch

Arguments

input
  (Tensor) the size of input will determine size of the output tensor.

fill_value
  the number to fill the output tensor with.

dtype
  (torch.dtype, optional) the desired data type of returned Tensor. Default: if None, defaults to the dtype of input.

layout
  (torch.layout, optional) the desired layout of returned tensor. Default: if None, defaults to the layout of input.

device
  (torch.device, optional) the desired device of returned tensor. Default: if None, defaults to the device of input.

requires_grad
  (bool, optional) If autograd should record operations on the returned tensor. Default: False.

memory_format
  (torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_preserve_format.

full_like(input, fill_value, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False, memory_format=torch.preserve_format) -> Tensor

Returns a tensor with the same size as input filled with fill_value. torch_full_like(input, fill_value) is equivalent to torch_full(input.size(), fill_value, dtype=input.dtype, layout=input.layout, device=input.device).
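
No example survives on this page; a minimal sketch of the equivalence described above (the shape and fill value are arbitrary):

# \dontrun{
x <- torch_randn(c(2, 3))
torch_full_like(x, 7)  # same shape and dtype as x, every element 7
#> torch_tensor
#>  7  7  7
#>  7  7  7
#> [ CPUFloatType{2,3} ]
# }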


diff --git a/docs/reference/torch_gather.html b/docs/reference/torch_gather.html
deleted file mode 100644

Gather — torch_gather • torch

Arguments

input
  (Tensor) the source tensor

dim
  (int) the axis along which to index

index
  (LongTensor) the indices of elements to gather

out
  (Tensor, optional) the destination tensor

sparse_grad
  (bool, optional) If True, gradient w.r.t. input will be a sparse tensor.

gather(input, dim, index, out=None, sparse_grad=False) -> Tensor

Gathers values along an axis specified by dim.

For a 3-D tensor the output is specified by:

out[i][j][k] = input[index[i][j][k]][j][k]  # if dim == 0
out[i][j][k] = input[i][index[i][j][k]][k]  # if dim == 1
out[i][j][k] = input[i][j][index[i][j][k]]  # if dim == 2

If input is an n-dimensional tensor with size \((x_0, x_1, ..., x_{i-1}, x_i, x_{i+1}, ..., x_{n-1})\) and dim = i, then index must be an \(n\)-dimensional tensor with size \((x_0, x_1, ..., x_{i-1}, y, x_{i+1}, ..., x_{n-1})\) where \(y \geq 1\), and out will have the same size as index.

Examples

# \dontrun{
t = torch_tensor(matrix(c(1, 2, 3, 4), ncol = 2, byrow = TRUE))
torch_gather(t, 2, torch_tensor(matrix(c(1, 1, 2, 1), ncol = 2, byrow = TRUE), dtype = torch_int64()))
#> torch_tensor
#>  1  1
#>  4  3
#> [ CPUFloatType{2,2} ]
# }
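
Tracing the dim == 2 rule through the example may help; a short walk-through in comments (indices are 1-based, as in R):

# t is              index is
#   1  2              1  1
#   3  4              2  1
# out[i][j] = t[i][index[i][j]], so
# out[1][1] = t[1][1] = 1    out[1][2] = t[1][1] = 1
# out[2][1] = t[2][2] = 4    out[2][2] = t[2][1] = 3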

diff --git a/docs/reference/torch_ge.html b/docs/reference/torch_ge.html
deleted file mode 100644

Ge — torch_ge • torch

Arguments

input
  (Tensor) the tensor to compare

other
  (Tensor or float) the tensor or value to compare

out
  (Tensor, optional) the output tensor that must be a BoolTensor

ge(input, other, out=None) -> Tensor

Computes \(\mbox{input} \geq \mbox{other}\) element-wise.

The second argument can be a number or a tensor whose shape is broadcastable with the first argument.

Examples

# \dontrun{
torch_ge(torch_tensor(matrix(1:4, ncol = 2, byrow = TRUE)),
         torch_tensor(matrix(c(1, 1, 4, 4), ncol = 2, byrow = TRUE)))
#> torch_tensor
#>  1  1
#>  0  1
#> [ CPUBoolType{2,2} ]
# }

diff --git a/docs/reference/torch_generator.html b/docs/reference/torch_generator.html
deleted file mode 100644

Create a Generator object — torch_generator • torch

A torch_generator is an object which manages the state of the algorithm that produces pseudo random numbers. Used as a keyword argument in many in-place random sampling functions.

torch_generator()

Examples

# \dontrun{
generator <- torch_generator()
generator$current_seed()
#> Loading required package: bit64
#> integer64
#> [1] 67280421310721
generator$set_current_seed(1234567L)
generator$current_seed()
#> integer64
#> [1] 1234567
# }

diff --git a/docs/reference/torch_geqrf.html b/docs/reference/torch_geqrf.html
deleted file mode 100644

Geqrf — torch_geqrf • torch

Arguments

input
  (Tensor) the input matrix

out
  (tuple, optional) the output tuple of (Tensor, Tensor)

geqrf(input, out=None) -> (Tensor, Tensor)

This is a low-level function for calling LAPACK directly. This function returns a namedtuple (a, tau) as defined in the LAPACK documentation for geqrf.

You'll generally want to use torch_qr instead.

Computes a QR decomposition of input, but without constructing \(Q\) and \(R\) as explicit separate matrices.

Rather, this directly calls the underlying LAPACK function ?geqrf, which produces a sequence of 'elementary reflectors'.

See the LAPACK documentation for geqrf for further details.
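
No example survives on this page; a minimal sketch of calling it and inspecting the two returned tensors, assuming the result comes back as an R list of two tensors (the 3x3 random input is arbitrary):

# \dontrun{
x <- torch_randn(c(3, 3))
res <- torch_geqrf(x)
res[[1]]  # packed Householder factors, same shape as x
res[[2]]  # tau, the scalar factors of the elementary reflectors
# }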


diff --git a/docs/reference/torch_ger.html b/docs/reference/torch_ger.html
deleted file mode 100644

Ger — torch_ger • torch

Arguments

input
  (Tensor) 1-D input vector

vec2
  (Tensor) 1-D input vector

out
  (Tensor, optional) optional output matrix

Note

This function does not broadcast.

ger(input, vec2, out=None) -> Tensor

Outer product of input and vec2. If input is a vector of size \(n\) and vec2 is a vector of size \(m\), then out must be a matrix of size \((n \times m)\).

Examples

# \dontrun{
v1 = torch_arange(1., 5.)
v2 = torch_arange(1., 4.)
torch_ger(v1, v2)
#> torch_tensor
#>   1   2   3
#>   2   4   6
#>   3   6   9
#>   4   8  12
#> [ CPUFloatType{4,3} ]
# }

diff --git a/docs/reference/torch_gt.html b/docs/reference/torch_gt.html
deleted file mode 100644

Gt — torch_gt • torch

Arguments

input
  (Tensor) the tensor to compare

other
  (Tensor or float) the tensor or value to compare

out
  (Tensor, optional) the output tensor that must be a BoolTensor

gt(input, other, out=None) -> Tensor

Computes \(\mbox{input} > \mbox{other}\) element-wise.

The second argument can be a number or a tensor whose shape is broadcastable with the first argument.

Examples

# \dontrun{
torch_gt(torch_tensor(matrix(1:4, ncol = 2, byrow = TRUE)),
         torch_tensor(matrix(c(1, 1, 4, 4), ncol = 2, byrow = TRUE)))
#> torch_tensor
#>  0  1
#>  0  0
#> [ CPUBoolType{2,2} ]
# }

diff --git a/docs/reference/torch_hamming_window.html b/docs/reference/torch_hamming_window.html
deleted file mode 100644

Hamming_window — torch_hamming_window • torch

Arguments

window_length
  (int) the size of returned window

periodic
  (bool, optional) If True, returns a window to be used as periodic function. If False, return a symmetric window.

alpha
  (float, optional) The coefficient \(\alpha\) in the equation below

beta
  (float, optional) The coefficient \(\beta\) in the equation below

dtype
  (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type). Only floating point types are supported.

layout
  (torch.layout, optional) the desired layout of returned window tensor. Only torch_strided (dense layout) is supported.

device
  (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

requires_grad
  (bool, optional) If autograd should record operations on the returned tensor. Default: False.

Note

If window_length \(= 1\), the returned window contains a single value 1.

This is a generalized version of torch_hann_window.

hamming_window(window_length, periodic=True, alpha=0.54, beta=0.46, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Hamming window function.

$$
w[n] = \alpha - \beta\ \cos \left( \frac{2 \pi n}{N - 1} \right),
$$

where \(N\) is the full window size.

The input window_length is a positive integer controlling the returned window size. The periodic flag determines whether the returned window trims off the last duplicate value from the symmetric window and is ready to be used as a periodic window with functions like torch_stft. Therefore, if periodic is true, the \(N\) in the above formula is in fact \(\mbox{window\_length} + 1\). Also, we always have torch_hamming_window(L, periodic=True) equal to torch_hamming_window(L + 1, periodic=False)[:-1].
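
No example survives on this page; a minimal sketch that also checks the periodic/symmetric relationship stated above (window length 5 is arbitrary; R's 1-based slice w_sym[1:5] plays the role of the [:-1] above):

# \dontrun{
torch_hamming_window(5)
w_per <- torch_hamming_window(5, periodic = TRUE)
w_sym <- torch_hamming_window(6, periodic = FALSE)
w_sym[1:5]  # should equal w_per
# }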


diff --git a/docs/reference/torch_hann_window.html b/docs/reference/torch_hann_window.html
deleted file mode 100644

Hann_window — torch_hann_window • torch

Arguments

window_length
  (int) the size of returned window

periodic
  (bool, optional) If True, returns a window to be used as periodic function. If False, return a symmetric window.

dtype
  (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type). Only floating point types are supported.

layout
  (torch.layout, optional) the desired layout of returned window tensor. Only torch_strided (dense layout) is supported.

device
  (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

requires_grad
  (bool, optional) If autograd should record operations on the returned tensor. Default: False.

Note

If window_length \(= 1\), the returned window contains a single value 1.

hann_window(window_length, periodic=True, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Hann window function.

$$
w[n] = \frac{1}{2}\ \left[1 - \cos \left( \frac{2 \pi n}{N - 1} \right)\right] =
       \sin^2 \left( \frac{\pi n}{N - 1} \right),
$$

where \(N\) is the full window size.

The input window_length is a positive integer controlling the returned window size. The periodic flag determines whether the returned window trims off the last duplicate value from the symmetric window and is ready to be used as a periodic window with functions like torch_stft. Therefore, if periodic is true, the \(N\) in the above formula is in fact \(\mbox{window\_length} + 1\). Also, we always have torch_hann_window(L, periodic=True) equal to torch_hann_window(L + 1, periodic=False)[:-1].
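
As with torch_hamming_window, no example survives here; a minimal sketch (window length 5 is arbitrary):

# \dontrun{
torch_hann_window(5)                    # periodic window, suitable for torch_stft
torch_hann_window(5, periodic = FALSE)  # symmetric window
# }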


diff --git a/docs/reference/torch_histc.html b/docs/reference/torch_histc.html
deleted file mode 100644

Histc — torch_histc • torch

Arguments

input
  (Tensor) the input tensor.

bins
  (int) number of histogram bins

min
  (int) lower end of the range (inclusive)

max
  (int) upper end of the range (inclusive)

out
  (Tensor, optional) the output tensor.

histc(input, bins=100, min=0, max=0, out=None) -> Tensor

Computes the histogram of a tensor.

The elements are sorted into equal width bins between min and max. If min and max are both zero, the minimum and maximum values of the data are used.

Examples

# \dontrun{
torch_histc(torch_tensor(c(1., 2, 1)), bins = 4, min = 0, max = 3)
#> torch_tensor
#>  0
#>  2
#>  1
#>  0
#> [ CPUFloatType{4} ]
# }
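
Leaving min and max at their zero defaults makes the bins span the observed data range, as described above; a minimal sketch (same data as before, so the range is inferred as [1, 2]):

# \dontrun{
torch_histc(torch_tensor(c(1., 2, 1)), bins = 4)
#> torch_tensor
#>  2
#>  0
#>  0
#>  1
#> [ CPUFloatType{4} ]
# }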

diff --git a/docs/reference/torch_ifft.html b/docs/reference/torch_ifft.html
deleted file mode 100644

Ifft — torch_ifft • torch

Arguments

input
  (Tensor) the input tensor of at least signal_ndim + 1 dimensions

signal_ndim
  (int) the number of dimensions in each signal. signal_ndim can only be 1, 2 or 3

normalized
  (bool, optional) controls whether to return normalized results. Default: False

Note

For CUDA tensors, an LRU cache is used for cuFFT plans to speed up repeatedly running FFT methods on tensors of same geometry with same configuration. See cufft-plan-cache for more details on how to monitor and control the cache.

ifft(input, signal_ndim, normalized=False) -> Tensor

Complex-to-complex Inverse Discrete Fourier Transform

This method computes the complex-to-complex inverse discrete Fourier transform. Ignoring the batch dimensions, it computes the following expression:

$$
X[\omega_1, \dots, \omega_d] =
  \frac{1}{\prod_{i=1}^d N_i} \sum_{n_1=0}^{N_1-1} \dots \sum_{n_d=0}^{N_d-1} x[n_1, \dots, n_d]
  e^{\ j\ 2 \pi \sum_{i=0}^d \frac{\omega_i n_i}{N_i}},
$$

where \(d\) = signal_ndim is the number of dimensions for the signal, and \(N_i\) is the size of signal dimension \(i\).

The argument specifications are almost identical to torch_fft. However, if normalized is set to True, this instead returns the results multiplied by \(\sqrt{\prod_{i=1}^d N_i}\), to become a unitary operator. Therefore, to invert a torch_fft, the normalized argument should be set identically for torch_fft.

Returns the real and the imaginary parts together as one tensor of the same shape as input.

The inverse of this function is torch_fft.

Warning

For CPU tensors, this method is currently only available with MKL. Use torch_backends.mkl.is_available to check if MKL is installed.

Examples

# \dontrun{
x = torch_randn(c(3, 3, 2))
x
#> torch_tensor
#> (1,.,.) =
#>  -0.4097 -0.4074
#>   0.4656  0.3306
#>  -0.5886 -2.1190
#>
#> (2,.,.) =
#>   0.3168 -1.2703
#>  -0.7706 -0.4545
#>   0.5461  1.3650
#>
#> (3,.,.) =
#>  -0.0778 -0.2976
#>  -0.1324  0.2705
#>   1.3860 -0.0941
#> [ CPUFloatType{3,3,2} ]
y = torch_fft(x, 2)
torch_ifft(y, 2) # recover x
#> torch_tensor
#> (1,.,.) =
#>  -0.4097 -0.4074
#>   0.4656  0.3306
#>  -0.5886 -2.1190
#>
#> (2,.,.) =
#>   0.3168 -1.2703
#>  -0.7706 -0.4545
#>   0.5461  1.3650
#>
#> (3,.,.) =
#>  -0.0778 -0.2976
#>  -0.1324  0.2705
#>   1.3860 -0.0941
#> [ CPUFloatType{3,3,2} ]
# }

diff --git a/docs/reference/torch_imag.html b/docs/reference/torch_imag.html
deleted file mode 100644

Imag — torch_imag • torch

Arguments

input
  (Tensor) the input tensor.

out
  (Tensor, optional) the output tensor.

imag(input, out=None) -> Tensor

Returns the imaginary part of the input tensor.

Warning

Not yet implemented.

$$
\mbox{out}_{i} = \mathrm{imag}(\mbox{input}_{i})
$$


diff --git a/docs/reference/torch_index_select.html b/docs/reference/torch_index_select.html
deleted file mode 100644

Index_select — torch_index_select • torch
    -
    - - - - -
    - -
    -
    - - -
    -

    Index_select

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    dim

    (int) the dimension in which we index

    index

    (LongTensor) the 1-D tensor containing the indices to index

    out

    (Tensor, optional) the output tensor.

    - -

    Note

    - -

    The returned tensor does not use the same storage as the original -tensor. If out has a different shape than expected, we -silently change it to the correct shape, reallocating the underlying -storage if necessary.

    -

    index_select(input, dim, index, out=None) -> Tensor

    Returns a new tensor which indexes the input tensor along dimension dim using the entries in index, which is a LongTensor.

    The returned tensor has the same number of dimensions as the original tensor (input). The dim-th dimension has the same size as the length of index; other dimensions have the same size as in the original tensor.

    Examples

    # \dontrun{
    x = torch_randn(c(3, 4))
    x
    #> torch_tensor
    #> -0.6090  0.2050 -0.2528  0.6190
    #>  0.3118  1.3114 -1.0626 -1.2048
    #>  0.3125 -0.2527  0.4871 -0.3394
    #> [ CPUFloatType{3,4} ]
    indices = torch_tensor(c(1, 3), dtype = torch_int64())
    torch_index_select(x, 1, indices)
    #> torch_tensor
    #> -0.6090  0.2050 -0.2528  0.6190
    #>  0.3125 -0.2527  0.4871 -0.3394
    #> [ CPUFloatType{2,4} ]
    torch_index_select(x, 2, indices)
    #> torch_tensor
    #> -0.6090 -0.2528
    #>  0.3118 -1.0626
    #>  0.3125  0.4871
    #> [ CPUFloatType{3,2} ]
    # }

diff --git a/docs/reference/torch_inverse.html b/docs/reference/torch_inverse.html
deleted file mode 100644

    Inverse — torch_inverse

    Arguments

    input: (Tensor) the input tensor of size \((*, n, n)\) where * is zero or more batch dimensions.
    out: (Tensor, optional) the output tensor.

    Note

    Irrespective of the original strides, the returned tensors will be transposed, i.e. with strides like `input.contiguous().transpose(-2, -1).stride()`.

    inverse(input, out=None) -> Tensor

    Takes the inverse of the square matrix input. input can be batches of 2D square tensors, in which case this function returns a tensor composed of individual inverses.

    Examples

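    A minimal sketch, not from the original page; it assumes only torch_randn(), torch_inverse() and torch_matmul() as documented in this reference.

    # \dontrun{
    a = torch_randn(c(4, 4))
    a_inv = torch_inverse(a)
    # the product of a matrix and its inverse should be close to the identity
    torch_matmul(a, a_inv)

    # batched input: each 3x3 matrix is inverted individually
    b = torch_randn(c(2, 3, 3))
    torch_inverse(b)
    # }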

diff --git a/docs/reference/torch_irfft.html b/docs/reference/torch_irfft.html
deleted file mode 100644

    Irfft — torch_irfft

    Arguments

    input: (Tensor) the input tensor of at least signal_ndim + 1 dimensions.
    signal_ndim: (int) the number of dimensions in each signal; signal_ndim can only be 1, 2 or 3.
    normalized: (bool, optional) controls whether to return normalized results. Default: FALSE.
    onesided: (bool, optional) controls whether input was halved to avoid redundancy, e.g., by torch_rfft(). Default: TRUE.
    signal_sizes: (list or torch.Size, optional) the size of the original signal (without batch dimension). Default: NULL.

    Note

    Due to the conjugate symmetry, `input` does not need to contain the full complex frequency values. Roughly half of the values will be sufficient, as is the case when `input` is given by `torch_rfft()` with `rfft(signal, onesided=TRUE)`. In such a case, set the `onesided` argument of this method to `TRUE`. Moreover, since the original signal shape information can sometimes be lost, optionally set `signal_sizes` to the size of the original signal (without the batch dimensions if in batched mode) to recover it with the correct shape.

    Therefore, to invert a `torch_rfft()`, the `normalized` and `onesided` arguments should be set identically for `torch_irfft()`, and preferably a `signal_sizes` is given to avoid size mismatch. See the example below for a case of size mismatch.

    See `torch_rfft()` for details on conjugate symmetry.

    The inverse of this function is torch_rfft().

    For CUDA tensors, an LRU cache is used for cuFFT plans to speed up repeatedly running FFT methods on tensors of the same geometry with the same configuration. See cufft-plan-cache for more details on how to monitor and control the cache.

    irfft(input, signal_ndim, normalized=False, onesided=True, signal_sizes=None) -> Tensor

    Complex-to-real Inverse Discrete Fourier Transform

    This method computes the complex-to-real inverse discrete Fourier transform. It is mathematically equivalent to torch_ifft, with differences only in the formats of the input and output.

    The argument specifications are almost identical to those of torch_ifft. As with torch_ifft, if normalized is set to TRUE, the result is normalized by multiplying it with \(\sqrt{\prod_{i=1}^K N_i}\) so that the operator is unitary, where \(N_i\) is the size of signal dimension \(i\).

    Warning

    Generally speaking, input to this function should contain values following conjugate symmetry. Note that even if onesided is TRUE, symmetry on some part is often still needed. When this requirement is not satisfied, the behavior of torch_irfft is undefined. Since torch_autograd.gradcheck estimates the numerical Jacobian with point perturbations, torch_irfft will almost certainly fail the check.

    For CPU tensors, this method is currently only available with MKL. Use torch_backends.mkl.is_available to check if MKL is installed.

    Examples

    # \dontrun{
    x = torch_randn(c(4, 4))
    torch_rfft(x, 2, onesided=TRUE)
    #> torch_tensor
    #> (1,.,.) =
    #>  -2.6442   0.0000
    #>   3.2951  -1.0012
    #> -10.0001   0.0000
    #>
    #> (2,.,.) =
    #>   1.3956  -4.9153
    #>   3.1204   7.3054
    #>  -1.9820   2.2872
    #>
    #> (3,.,.) =
    #>   3.3926   0.0000
    #>   5.2495   2.0603
    #>  -4.0185   0.0000
    #>
    #> (4,.,.) =
    #>   1.3956   4.9153
    #>   4.1712   1.2315
    #>  -1.9820  -2.2872
    #> [ CPUFloatType{4,3,2} ]
    x = torch_randn(c(4, 5))
    torch_rfft(x, 2, onesided=TRUE)
    #> torch_tensor
    #> (1,.,.) =
    #>   6.1729   0.0000
    #>   2.7828  -0.8086
    #>   0.0001  -0.7514
    #>
    #> (2,.,.) =
    #>  -0.3550   1.3322
    #>  -1.3598   3.0191
    #>   1.5158  -3.2844
    #>
    #> (3,.,.) =
    #>   5.1938   0.0000
    #>  -2.5545   1.6405
    #>   7.3076   5.2401
    #>
    #> (4,.,.) =
    #>  -0.3550  -1.3322
    #>   0.9130  -2.5817
    #>  -0.4966  -2.0859
    #> [ CPUFloatType{4,3,2} ]
    y = torch_rfft(x, 2, onesided=TRUE)
    torch_irfft(y, 2, onesided=TRUE, signal_sizes=c(4,5)) # recover x
    #> torch_tensor
    #>  1.3437 -0.2165  0.6494  0.9663 -0.0787
    #> -0.7215  1.0848 -0.3526 -0.1508 -0.2811
    #>  1.3002 -0.5011  1.6580 -0.1709  0.7330
    #>  0.4254  1.6956 -1.7164 -0.2147  0.7211
    #> [ CPUFloatType{4,5} ]
    # }

diff --git a/docs/reference/torch_is_complex.html b/docs/reference/torch_is_complex.html
deleted file mode 100644

    Is_complex — torch_is_complex

    Arguments

    input: (Tensor) the tensor to test.

    is_complex(input) -> (bool)

    Returns TRUE if the data type of input is a complex data type, i.e., one of torch_complex64 and torch_complex128.
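    Examples

    A hedged sketch, not from the original page: complex tensor construction is flagged as not yet implemented elsewhere in these docs, so only the FALSE case is shown.

    # \dontrun{
    # a default float tensor is not complex, so this returns FALSE
    torch_is_complex(torch_tensor(c(1, 2)))
    # }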

diff --git a/docs/reference/torch_is_floating_point.html b/docs/reference/torch_is_floating_point.html
deleted file mode 100644

    Is_floating_point — torch_is_floating_point

    Arguments

    input: (Tensor) the tensor to test.

    is_floating_point(input) -> (bool)

    Returns TRUE if the data type of input is a floating point data type, i.e., one of torch_float64, torch_float32 and torch_float16.
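    Examples

    A minimal sketch, not part of the original page; it only assumes torch_tensor() and torch_is_floating_point() as documented here.

    # \dontrun{
    # numeric input defaults to a float tensor, so this returns TRUE
    torch_is_floating_point(torch_tensor(c(1.5, 2.5)))
    # integer input yields a non-floating-point tensor, so this returns FALSE
    torch_is_floating_point(torch_tensor(1:3))
    # }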

diff --git a/docs/reference/torch_isfinite.html b/docs/reference/torch_isfinite.html
deleted file mode 100644

    Isfinite — torch_isfinite

    Arguments

    tensor: (Tensor) a tensor to check.

    isfinite(tensor) -> Tensor

    Returns a new tensor with boolean elements representing whether each element is finite or not.

    Examples

    # \dontrun{
    torch_isfinite(torch_tensor(c(1, Inf, 2, -Inf, NaN)))
    #> torch_tensor
    #> 1
    #> 0
    #> 1
    #> 0
    #> 0
    #> [ CPUBoolType{5} ]
    # }

diff --git a/docs/reference/torch_isinf.html b/docs/reference/torch_isinf.html
deleted file mode 100644

    Isinf — torch_isinf

    Arguments

    tensor: (Tensor) a tensor to check.

    isinf(tensor) -> Tensor

    Returns a new tensor with boolean elements representing whether each element is +/-Inf or not.

    Examples

    # \dontrun{
    torch_isinf(torch_tensor(c(1, Inf, 2, -Inf, NaN)))
    #> torch_tensor
    #> 0
    #> 1
    #> 0
    #> 1
    #> 0
    #> [ CPUBoolType{5} ]
    # }

diff --git a/docs/reference/torch_isnan.html b/docs/reference/torch_isnan.html
deleted file mode 100644

    Isnan — torch_isnan

    Arguments

    input: (Tensor) a tensor to check.

    isnan(input) -> Tensor

    Returns a new tensor with boolean elements representing whether each element is NaN or not.

    Examples

    # \dontrun{
    torch_isnan(torch_tensor(c(1, NaN, 2)))
    #> torch_tensor
    #> 0
    #> 1
    #> 0
    #> [ CPUBoolType{3} ]
    # }

diff --git a/docs/reference/torch_kthvalue.html b/docs/reference/torch_kthvalue.html
deleted file mode 100644

    Kthvalue — torch_kthvalue

    Arguments

    input: (Tensor) the input tensor.
    k: (int) k for the k-th smallest element.
    dim: (int, optional) the dimension to find the kth value along.
    keepdim: (bool) whether the output tensor has dim retained or not.
    out: (tuple, optional) the output tuple of (Tensor, LongTensor), which can be optionally given to be used as output buffers.

    kthvalue(input, k, dim=None, keepdim=False, out=None) -> (Tensor, LongTensor)

    Returns a namedtuple (values, indices), where values is the k-th smallest element of each row of the input tensor in the given dimension dim, and indices is the index location of each element found.

    If dim is not given, the last dimension of the input is chosen.

    If keepdim is TRUE, both the values and indices tensors are the same size as input, except in the dimension dim where they are of size 1. Otherwise, dim is squeezed (see torch_squeeze), resulting in both the values and indices tensors having 1 fewer dimension than the input tensor.

    Examples

    # \dontrun{
    x = torch_arange(1., 6.)
    x
    #> torch_tensor
    #> 1
    #> 2
    #> 3
    #> 4
    #> 5
    #> [ CPUFloatType{5} ]
    torch_kthvalue(x, 4)
    #> [[1]]
    #> torch_tensor
    #> 4
    #> [ CPUFloatType{} ]
    #>
    #> [[2]]
    #> torch_tensor
    #> 3
    #> [ CPULongType{} ]
    x = torch_arange(1., 7.)$resize_(c(2, 3))
    x
    #> torch_tensor
    #> 1 2 3
    #> 4 5 6
    #> [ CPUFloatType{2,3} ]
    torch_kthvalue(x, 2, 1, TRUE)
    #> [[1]]
    #> torch_tensor
    #> 4 5 6
    #> [ CPUFloatType{1,3} ]
    #>
    #> [[2]]
    #> torch_tensor
    #> 1 1 1
    #> [ CPULongType{1,3} ]
    # }

diff --git a/docs/reference/torch_layout.html b/docs/reference/torch_layout.html
deleted file mode 100644

    Creates the corresponding layout — torch_layout
    torch_strided()

    torch_sparse_coo()
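    Examples

    A hedged sketch, not from the original page: it assumes that tensor creation functions accept these values through their layout argument, as the torch_linspace() page below documents.

    # \dontrun{
    # hypothetical usage: request the default dense (strided) layout explicitly
    x = torch_zeros(c(2, 2), layout = torch_strided())
    # }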

diff --git a/docs/reference/torch_le.html b/docs/reference/torch_le.html
deleted file mode 100644

    Le — torch_le

    Arguments

    input: (Tensor) the tensor to compare.
    other: (Tensor or float) the tensor or value to compare.
    out: (Tensor, optional) the output tensor, which must be a BoolTensor.

    le(input, other, out=None) -> Tensor

    Computes \(\mbox{input} \leq \mbox{other}\) element-wise.

    The second argument can be a number or a tensor whose shape is broadcastable with the first argument.

    Examples

    # \dontrun{
    torch_le(torch_tensor(matrix(1:4, ncol = 2, byrow = TRUE)),
             torch_tensor(matrix(c(1, 1, 4, 4), ncol = 2, byrow = TRUE)))
    #> torch_tensor
    #> 1 0
    #> 1 1
    #> [ CPUBoolType{2,2} ]
    # }

diff --git a/docs/reference/torch_lerp.html b/docs/reference/torch_lerp.html
deleted file mode 100644

    Lerp — torch_lerp

    Arguments

    input: (Tensor) the tensor with the starting points.
    end: (Tensor) the tensor with the ending points.
    weight: (float or tensor) the weight for the interpolation formula.
    out: (Tensor, optional) the output tensor.

    lerp(input, end, weight, out=None)

    Does a linear interpolation of two tensors start (given by input) and end, based on a scalar or tensor weight, and returns the resulting out tensor.

    $$
      \mbox{out}_i = \mbox{start}_i + \mbox{weight}_i \times (\mbox{end}_i - \mbox{start}_i)
    $$

    The shapes of start and end must be broadcastable. If weight is a tensor, then the shapes of weight, start, and end must be broadcastable.

    Examples

    # \dontrun{
    start = torch_arange(1., 5.)
    end = torch_empty(4)$fill_(10)
    start
    #> torch_tensor
    #> 1
    #> 2
    #> 3
    #> 4
    #> [ CPUFloatType{4} ]
    end
    #> torch_tensor
    #> 10
    #> 10
    #> 10
    #> 10
    #> [ CPUFloatType{4} ]
    torch_lerp(start, end, 0.5)
    #> torch_tensor
    #> 5.5000
    #> 6.0000
    #> 6.5000
    #> 7.0000
    #> [ CPUFloatType{4} ]
    torch_lerp(start, end, torch_full_like(start, 0.5))
    #> torch_tensor
    #> 5.5000
    #> 6.0000
    #> 6.5000
    #> 7.0000
    #> [ CPUFloatType{4} ]
    # }

diff --git a/docs/reference/torch_lgamma.html b/docs/reference/torch_lgamma.html
deleted file mode 100644

    Lgamma — torch_lgamma

    Arguments

    input: (Tensor) the input tensor.
    out: (Tensor, optional) the output tensor.

    lgamma(input, out=None) -> Tensor

    Computes the logarithm of the gamma function on input.

    $$
      \mbox{out}_{i} = \log \Gamma(\mbox{input}_{i})
    $$

    Examples

    # \dontrun{
    a = torch_arange(0.5, 2, 0.5)
    torch_lgamma(a)
    #> torch_tensor
    #>  0.5724
    #>  0.0000
    #> -0.1208
    #> [ CPUFloatType{3} ]
    # }

diff --git a/docs/reference/torch_linspace.html b/docs/reference/torch_linspace.html
deleted file mode 100644

    Linspace — torch_linspace

    Arguments

    start: (float) the starting value for the set of points.
    end: (float) the ending value for the set of points.
    steps: (int) number of points to sample between start and end. Default: 100.
    out: (Tensor, optional) the output tensor.
    dtype: (torch.dtype, optional) the desired data type of the returned tensor. Default: if NULL, uses a global default (see torch_set_default_tensor_type).
    layout: (torch.layout, optional) the desired layout of the returned Tensor. Default: torch_strided.
    device: (torch.device, optional) the desired device of the returned tensor. Default: if NULL, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.
    requires_grad: (bool, optional) if autograd should record operations on the returned tensor. Default: FALSE.

    linspace(start, end, steps=100, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

    Returns a one-dimensional tensor of steps equally spaced points between start and end.

    The output tensor is 1-D of size steps.

    Examples

    # \dontrun{
    torch_linspace(3, 10, steps=5)
    #> torch_tensor
    #>  3.0000
    #>  4.7500
    #>  6.5000
    #>  8.2500
    #> 10.0000
    #> [ CPUFloatType{5} ]
    torch_linspace(-10, 10, steps=5)
    #> torch_tensor
    #> -10
    #>  -5
    #>   0
    #>   5
    #>  10
    #> [ CPUFloatType{5} ]
    torch_linspace(start=-10, end=10, steps=5)
    #> torch_tensor
    #> -10
    #>  -5
    #>   0
    #>   5
    #>  10
    #> [ CPUFloatType{5} ]
    torch_linspace(start=-10, end=10, steps=1)
    #> torch_tensor
    #> -10
    #> [ CPUFloatType{1} ]
    # }

diff --git a/docs/reference/torch_load.html b/docs/reference/torch_load.html
deleted file mode 100644

    Loads a saved object — torch_load
    torch_load(path)
    Arguments

    path: a path to the saved object.

    See also

    Other torch_save: torch_save()
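    Examples

    A minimal round-trip sketch, not from the original page; torch_save() is the companion function listed above.

    # \dontrun{
    x = torch_tensor(c(1, 2, 3))
    path = tempfile(fileext = ".pt")
    torch_save(x, path)
    y = torch_load(path)
    y
    # }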


diff --git a/docs/reference/torch_log.html b/docs/reference/torch_log.html
deleted file mode 100644

    Log — torch_log

    Arguments

    input: (Tensor) the input tensor.
    out: (Tensor, optional) the output tensor.

    log(input, out=None) -> Tensor

    Returns a new tensor with the natural logarithm of the elements of input.

    $$
      y_{i} = \log_{e} (x_{i})
    $$

    Examples

    # \dontrun{
    a = torch_randn(c(5))
    a
    #> torch_tensor
    #>  1.6609
    #> -1.5017
    #>  0.7439
    #> -0.6239
    #>  1.2395
    #> [ CPUFloatType{5} ]
    torch_log(a)
    #> torch_tensor
    #>  0.5074
    #>  nan
    #> -0.2958
    #>  nan
    #>  0.2147
    #> [ CPUFloatType{5} ]
    # }

diff --git a/docs/reference/torch_log10.html b/docs/reference/torch_log10.html
deleted file mode 100644

    Log10 — torch_log10

    Arguments

    input: (Tensor) the input tensor.
    out: (Tensor, optional) the output tensor.

    log10(input, out=None) -> Tensor

    Returns a new tensor with the logarithm to the base 10 of the elements of input.

    $$
      y_{i} = \log_{10} (x_{i})
    $$

    Examples

    # \dontrun{
    a = torch_rand(5)
    a
    #> torch_tensor
    #> 0.6619
    #> 0.0908
    #> 0.8331
    #> 0.1240
    #> 0.1908
    #> [ CPUFloatType{5} ]
    torch_log10(a)
    #> torch_tensor
    #> -0.1792
    #> -1.0419
    #> -0.0793
    #> -0.9067
    #> -0.7195
    #> [ CPUFloatType{5} ]
    # }

diff --git a/docs/reference/torch_log1p.html b/docs/reference/torch_log1p.html
deleted file mode 100644

    Log1p — torch_log1p

    Arguments

    input: (Tensor) the input tensor.
    out: (Tensor, optional) the output tensor.

    Note

    This function is more accurate than torch_log for small values of input.

    log1p(input, out=None) -> Tensor

    Returns a new tensor with the natural logarithm of (1 + input).

    $$
      y_i = \log_{e} (x_i + 1)
    $$

    Examples

    # \dontrun{
    a = torch_randn(c(5))
    a
    #> torch_tensor
    #> -0.7431
    #> -0.9335
    #>  0.2461
    #>  0.9212
    #> -0.0972
    #> [ CPUFloatType{5} ]
    torch_log1p(a)
    #> torch_tensor
    #> -1.3591
    #> -2.7110
    #>  0.2200
    #>  0.6530
    #> -0.1023
    #> [ CPUFloatType{5} ]
    # }

diff --git a/docs/reference/torch_log2.html b/docs/reference/torch_log2.html
deleted file mode 100644

    Log2 — torch_log2

    Arguments

    input: (Tensor) the input tensor.
    out: (Tensor, optional) the output tensor.

    log2(input, out=None) -> Tensor

    Returns a new tensor with the logarithm to the base 2 of the elements of input.

    $$
      y_{i} = \log_{2} (x_{i})
    $$

    Examples

    # \dontrun{
    a = torch_rand(5)
    a
    #> torch_tensor
    #> 0.5313
    #> 0.0120
    #> 0.0760
    #> 0.6910
    #> 0.9385
    #> [ CPUFloatType{5} ]
    torch_log2(a)
    #> torch_tensor
    #> -0.9125
    #> -6.3856
    #> -3.7179
    #> -0.5332
    #> -0.0915
    #> [ CPUFloatType{5} ]
    # }

diff --git a/docs/reference/torch_logdet.html b/docs/reference/torch_logdet.html
deleted file mode 100644

    Logdet — torch_logdet

    Arguments

    input: (Tensor) the input tensor of size (*, n, n) where * is zero or more batch dimensions.

    Note

    Result is `-inf` if `input` has zero determinant, and is `nan` if `input` has a negative determinant.

    Backward through `logdet` internally uses SVD results when `input` is not invertible. In this case, double backward through `logdet` will be unstable when `input` doesn't have distinct singular values. See torch_svd for details.

    logdet(input) -> Tensor

    Calculates the log determinant of a square matrix or batches of square matrices.

    Examples

    # \dontrun{
    A = torch_randn(c(3, 3))
    torch_det(A)
    #> torch_tensor
    #> -0.779207
    #> [ CPUFloatType{} ]
    torch_logdet(A)
    #> torch_tensor
    #> nan
    #> [ CPUFloatType{} ]
    A
    #> torch_tensor
    #> -0.1953 -0.9165 -0.4366
    #>  0.0674  0.1296 -1.5398
    #> -0.7582 -1.0511 -0.3637
    #> [ CPUFloatType{3,3} ]
    A$det()
    #> torch_tensor
    #> -0.779207
    #> [ CPUFloatType{} ]
    A$det()$log()
    #> torch_tensor
    #> nan
    #> [ CPUFloatType{} ]
    # }

diff --git a/docs/reference/torch_logical_and.html b/docs/reference/torch_logical_and.html
deleted file mode 100644

    Logical_and — torch_logical_and

    Arguments

    input: (Tensor) the input tensor.
    other: (Tensor) the tensor to compute AND with.
    out: (Tensor, optional) the output tensor.

    logical_and(input, other, out=None) -> Tensor

    Computes the element-wise logical AND of the given input tensors. Zeros are treated as FALSE and nonzeros are treated as TRUE.

    Examples

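    A minimal sketch, not from the original page; it mirrors the torch_logical_xor() example further below.

    # \dontrun{
    torch_logical_and(torch_tensor(c(TRUE, FALSE, TRUE)), torch_tensor(c(TRUE, FALSE, FALSE)))
    a = torch_tensor(c(0, 1, 10, 0), dtype = torch_int8())
    b = torch_tensor(c(4, 0, 1, 0), dtype = torch_int8())
    torch_logical_and(a, b)
    # }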

diff --git a/docs/reference/torch_logical_not.html b/docs/reference/torch_logical_not.html
deleted file mode 100644

    Logical_not — torch_logical_not

    Arguments

    input: (Tensor) the input tensor.
    out: (Tensor, optional) the output tensor.

    logical_not(input, out=None) -> Tensor

    Computes the element-wise logical NOT of the given input tensor. If not specified, the output tensor will have the bool dtype. If the input tensor is not a bool tensor, zeros are treated as FALSE and non-zeros are treated as TRUE.

    Examples

    # \dontrun{
    torch_logical_not(torch_tensor(c(TRUE, FALSE)))
    #> torch_tensor
    #> 0
    #> 1
    #> [ CPUBoolType{2} ]
    torch_logical_not(torch_tensor(c(0, 1, -10), dtype=torch_int8()))
    #> torch_tensor
    #> 1
    #> 0
    #> 0
    #> [ CPUBoolType{3} ]
    torch_logical_not(torch_tensor(c(0., 1.5, -10.), dtype=torch_double()))
    #> torch_tensor
    #> 1
    #> 0
    #> 0
    #> [ CPUBoolType{3} ]
    # }

diff --git a/docs/reference/torch_logical_or.html b/docs/reference/torch_logical_or.html
deleted file mode 100644

    Logical_or — torch_logical_or

    Arguments

    input: (Tensor) the input tensor.
    other: (Tensor) the tensor to compute OR with.
    out: (Tensor, optional) the output tensor.

    logical_or(input, other, out=None) -> Tensor

    Computes the element-wise logical OR of the given input tensors. Zeros are treated as FALSE and nonzeros are treated as TRUE.

    Examples

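    A minimal sketch, not from the original page; it mirrors the torch_logical_xor() example further below.

    # \dontrun{
    torch_logical_or(torch_tensor(c(TRUE, FALSE, TRUE)), torch_tensor(c(TRUE, FALSE, FALSE)))
    a = torch_tensor(c(0, 1, 10, 0), dtype = torch_int8())
    b = torch_tensor(c(4, 0, 1, 0), dtype = torch_int8())
    torch_logical_or(a, b)
    # }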

diff --git a/docs/reference/torch_logical_xor.html b/docs/reference/torch_logical_xor.html
deleted file mode 100644

    Logical_xor — torch_logical_xor

    Arguments

    input: (Tensor) the input tensor.
    other: (Tensor) the tensor to compute XOR with.
    out: (Tensor, optional) the output tensor.

    logical_xor(input, other, out=None) -> Tensor

    Computes the element-wise logical XOR of the given input tensors. Zeros are treated as FALSE and nonzeros are treated as TRUE.

    Examples

    # \dontrun{
    torch_logical_xor(torch_tensor(c(TRUE, FALSE, TRUE)), torch_tensor(c(TRUE, FALSE, FALSE)))
    #> torch_tensor
    #> 0
    #> 0
    #> 1
    #> [ CPUBoolType{3} ]
    a = torch_tensor(c(0, 1, 10, 0), dtype=torch_int8())
    b = torch_tensor(c(4, 0, 1, 0), dtype=torch_int8())
    torch_logical_xor(a, b)
    #> torch_tensor
    #> 1
    #> 1
    #> 0
    #> 0
    #> [ CPUBoolType{4} ]
    torch_logical_xor(a$to(dtype=torch_double()), b$to(dtype=torch_double()))
    #> torch_tensor
    #> 1
    #> 1
    #> 0
    #> 0
    #> [ CPUBoolType{4} ]
    torch_logical_xor(a$to(dtype=torch_double()), b)
    #> torch_tensor
    #> 1
    #> 1
    #> 0
    #> 0
    #> [ CPUBoolType{4} ]
    # }

diff --git a/docs/reference/torch_logspace.html b/docs/reference/torch_logspace.html
deleted file mode 100644

    Logspace — torch_logspace

    Arguments

    start: (float) the starting value for the set of points.
    end: (float) the ending value for the set of points.
    steps: (int) number of points to sample between start and end. Default: 100.
    base: (float) base of the logarithm function. Default: 10.0.
    out: (Tensor, optional) the output tensor.
    dtype: (torch.dtype, optional) the desired data type of the returned tensor. Default: if NULL, uses a global default (see torch_set_default_tensor_type).
    layout: (torch.layout, optional) the desired layout of the returned Tensor. Default: torch_strided.
    device: (torch.device, optional) the desired device of the returned tensor. Default: if NULL, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.
    requires_grad: (bool, optional) if autograd should record operations on the returned tensor. Default: FALSE.

    logspace(start, end, steps=100, base=10.0, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

    Returns a one-dimensional tensor of steps points logarithmically spaced with base base between \({\mbox{base}}^{\mbox{start}}\) and \({\mbox{base}}^{\mbox{end}}\).

    The output tensor is 1-D of size steps.

    Examples

    # \dontrun{
    torch_logspace(start=-10, end=10, steps=5)
    #> torch_tensor
    #> 1.0000e-10
    #> 1.0000e-05
    #> 1.0000e+00
    #> 1.0000e+05
    #> 1.0000e+10
    #> [ CPUFloatType{5} ]
    torch_logspace(start=0.1, end=1.0, steps=5)
    #> torch_tensor
    #>  1.2589
    #>  2.1135
    #>  3.5481
    #>  5.9566
    #> 10.0000
    #> [ CPUFloatType{5} ]
    torch_logspace(start=0.1, end=1.0, steps=1)
    #> torch_tensor
    #> 1.2589
    #> [ CPUFloatType{1} ]
    torch_logspace(start=2, end=2, steps=1, base=2)
    #> torch_tensor
    #> 4
    #> [ CPUFloatType{1} ]
    # }

diff --git a/docs/reference/torch_logsumexp.html b/docs/reference/torch_logsumexp.html
deleted file mode 100644

    Logsumexp — torch_logsumexp

    Arguments

    input: (Tensor) the input tensor.
    dim: (int or tuple of ints) the dimension or dimensions to reduce.
    keepdim: (bool) whether the output tensor has dim retained or not.
    out: (Tensor, optional) the output tensor.

    logsumexp(input, dim, keepdim=False, out=None)

    Returns the log of summed exponentials of each row of the input tensor in the given dimension dim. The computation is numerically stabilized.

    For summation index \(j\) given by dim and other indices \(i\), the result is

    $$
      \mbox{logsumexp}(x)_{i} = \log \sum_j \exp(x_{ij})
    $$

    If keepdim is TRUE, the output tensor is of the same size as input except in the dimension(s) dim where it is of size 1. Otherwise, dim is squeezed (see torch_squeeze), resulting in the output tensor having 1 (or len(dim)) fewer dimension(s).

    Examples

    # \dontrun{
    a = torch_randn(c(3, 3))
    torch_logsumexp(a, 1)
    #> torch_tensor
    #> 1.9705
    #> 0.9976
    #> 1.5809
    #> [ CPUFloatType{3} ]
    # }

diff --git a/docs/reference/torch_lstsq.html b/docs/reference/torch_lstsq.html
deleted file mode 100644

    Lstsq — torch_lstsq

    Arguments

    input: (Tensor) the matrix \(B\).
    A: (Tensor) the \(m\) by \(n\) matrix \(A\).
    out: (tuple, optional) the optional destination tensor.

    Note

    The case when \(m < n\) is not supported on the GPU.

    lstsq(input, A, out=None) -> Tensor

    Computes the solution to the least squares and least norm problems for a full rank matrix \(A\) of size \((m \times n)\) and a matrix \(B\) of size \((m \times k)\).

    If \(m \geq n\), torch_lstsq() solves the least-squares problem:

    $$
      \begin{array}{ll}
        \min_X & \|AX-B\|_2.
      \end{array}
    $$

    If \(m < n\), torch_lstsq() solves the least-norm problem:

    $$
      \begin{array}{llll}
        \min_X & \|X\|_2 & \mbox{subject to} & AX = B.
      \end{array}
    $$

    The returned tensor \(X\) has shape \((\mbox{max}(m, n) \times k)\). The first \(n\) rows of \(X\) contain the solution. If \(m \geq n\), the residual sum of squares for the solution in each column is given by the sum of squares of elements in the remaining \(m - n\) rows of that column.

    Examples

    # \dontrun{
    A = torch_tensor(rbind(
      c(1, 1, 1),
      c(2, 3, 4),
      c(3, 5, 2),
      c(4, 2, 5),
      c(5, 4, 3)
    ))
    B = torch_tensor(rbind(
      c(-10, -3),
      c(12, 14),
      c(14, 12),
      c(16, 16),
      c(18, 16)
    ))
    out = torch_lstsq(B, A)
    out[[1]]
    #> torch_tensor
    #>  2.0000  1.0000
    #>  1.0000  1.0000
    #>  1.0000  2.0000
    #> 10.9635  4.8501
    #>  8.9332  5.2418
    #> [ CPUFloatType{5,2} ]
    # }

diff --git a/docs/reference/torch_lt.html b/docs/reference/torch_lt.html
deleted file mode 100644

    Lt — torch_lt

    Arguments

    input: (Tensor) the tensor to compare.
    other: (Tensor or float) the tensor or value to compare.
    out: (Tensor, optional) the output tensor, which must be a BoolTensor.

    lt(input, other, out=None) -> Tensor

    Computes \(\mbox{input} < \mbox{other}\) element-wise.

    The second argument can be a number or a tensor whose shape is broadcastable with the first argument.

    Examples

    # \dontrun{
    torch_lt(torch_tensor(matrix(1:4, ncol = 2, byrow = TRUE)),
             torch_tensor(matrix(c(1, 1, 4, 4), ncol = 2, byrow = TRUE)))
    #> torch_tensor
    #> 0 0
    #> 1 0
    #> [ CPUBoolType{2,2} ]
    # }

diff --git a/docs/reference/torch_lu.html b/docs/reference/torch_lu.html
deleted file mode 100644

    LU — torch_lu
    Computes the LU factorization of a matrix or batches of matrices A. Returns a tuple containing the LU factorization and pivots of A. Pivoting is done if pivot is set to TRUE.
    torch_lu(A, pivot = TRUE, get_infos = FALSE, out = NULL)
    Arguments

    A: (Tensor) the tensor to factor, of size \((*, m, n)\).
    pivot: (bool, optional) controls whether pivoting is done. Default: TRUE.
    get_infos: (bool, optional) if set to TRUE, returns an info IntTensor. Default: FALSE.
    out: (tuple, optional) optional output tuple. If get_infos is TRUE, then the elements in the tuple are Tensor, IntTensor, and IntTensor. If get_infos is FALSE, then the elements in the tuple are Tensor, IntTensor. Default: NULL.

    Examples

    # \dontrun{
    A = torch_randn(c(2, 3, 3))
    torch_lu(A)
    #> [[1]]
    #> torch_tensor
    #> (1,.,.) =
    #>   1.5274 -0.8578 -1.5278
    #>  -0.5446  0.3682 -0.1468
    #>   0.6892  0.4103  1.5956
    #>
    #> (2,.,.) =
    #>  -1.3121 -0.1896 -2.3469
    #>  -0.7969  1.0315 -0.9561
    #>   0.7149 -0.0635  1.4525
    #> [ CPUFloatType{2,3,3} ]
    #>
    #> [[2]]
    #> torch_tensor
    #> 3 2 3
    #> 3 2 3
    #> [ CPUIntType{2,3} ]
    # }

diff --git a/docs/reference/torch_lu_solve.html b/docs/reference/torch_lu_solve.html
deleted file mode 100644

    Lu_solve — torch_lu_solve

    Arguments

    b: (Tensor) the RHS tensor of size \((*, m, k)\), where \(*\) is zero or more batch dimensions.
    LU_data: (Tensor) the pivoted LU factorization of A from torch_lu of size \((*, m, m)\), where \(*\) is zero or more batch dimensions.
    LU_pivots: (IntTensor) the pivots of the LU factorization from torch_lu of size \((*, m)\), where \(*\) is zero or more batch dimensions. The batch dimensions of LU_pivots must be equal to the batch dimensions of LU_data.
    out: (Tensor, optional) the output tensor.

    lu_solve(input, LU_data, LU_pivots, out=None) -> Tensor

    Returns the LU solve of the linear system \(Ax = b\) using the partially pivoted LU factorization of A from torch_lu.

    Examples

    # \dontrun{
    A = torch_randn(c(2, 3, 3))
    b = torch_randn(c(2, 3, 1))
    out = torch_lu(A)
    x = torch_lu_solve(b, out[[1]], out[[2]])
    torch_norm(torch_bmm(A, x) - b)
    #> torch_tensor
    #> 7.02886e-08
    #> [ CPUFloatType{} ]
    # }

diff --git a/docs/reference/torch_masked_select.html b/docs/reference/torch_masked_select.html
deleted file mode 100644

    Masked_select — torch_masked_select

    Arguments

    input: (Tensor) the input tensor.
    mask: (BoolTensor) the tensor containing the binary mask to index with.
    out: (Tensor, optional) the output tensor.

    Note

    The returned tensor does not use the same storage as the original tensor.

    masked_select(input, mask, out=None) -> Tensor

    Returns a new 1-D tensor which indexes the input tensor according to the boolean mask mask, which is a BoolTensor.

    The shapes of the mask tensor and the input tensor don't need to match, but they must be broadcastable.

    Examples

    # \dontrun{
    x = torch_randn(c(3, 4))
    x
    #> torch_tensor
    #>  0.1475 -0.7929 -0.3681  1.0487
    #> -1.1304  1.8525 -0.8021  0.2346
    #>  1.8222  0.7533  0.4753  0.3604
    #> [ CPUFloatType{3,4} ]
    mask = x$ge(0.5)
    mask
    #> torch_tensor
    #> 0 0 0 1
    #> 0 1 0 0
    #> 1 1 0 0
    #> [ CPUBoolType{3,4} ]
    torch_masked_select(x, mask)
    #> torch_tensor
    #> 1.0487
    #> 1.8525
    #> 1.8222
    #> 0.7533
    #> [ CPUFloatType{4} ]
    # }

diff --git a/docs/reference/torch_matmul.html b/docs/reference/torch_matmul.html
deleted file mode 100644

    Matmul — torch_matmul

    Arguments

    input: (Tensor) the first tensor to be multiplied.
    other: (Tensor) the second tensor to be multiplied.
    out: (Tensor, optional) the output tensor.

    Note

    The 1-dimensional dot product version of this function does not support an `out` parameter.

    matmul(input, other, out=None) -> Tensor

    Matrix product of two tensors.

    The behavior depends on the dimensionality of the tensors as follows:

    • If both tensors are 1-dimensional, the dot product (scalar) is returned.

    • If both arguments are 2-dimensional, the matrix-matrix product is returned.

    • If the first argument is 1-dimensional and the second argument is 2-dimensional, a 1 is prepended to its dimension for the purpose of the matrix multiply. After the matrix multiply, the prepended dimension is removed.

    • If the first argument is 2-dimensional and the second argument is 1-dimensional, the matrix-vector product is returned.

    • If both arguments are at least 1-dimensional and at least one argument is N-dimensional (where N > 2), then a batched matrix multiply is returned. If the first argument is 1-dimensional, a 1 is prepended to its dimension for the purpose of the batched matrix multiply and removed after. If the second argument is 1-dimensional, a 1 is appended to its dimension for the purpose of the batched matrix multiply and removed after. The non-matrix (i.e. batch) dimensions are broadcasted (and thus must be broadcastable). For example, if input is a \((j \times 1 \times n \times m)\) tensor and other is a \((k \times m \times p)\) tensor, out will be a \((j \times k \times n \times p)\) tensor.

    Examples

    # \dontrun{

    # vector x vector
    tensor1 = torch_randn(c(3))
    tensor2 = torch_randn(c(3))
    torch_matmul(tensor1, tensor2)
    #> torch_tensor
    #> -2.01064
    #> [ CPUFloatType{} ]
    # matrix x vector
    tensor1 = torch_randn(c(3, 4))
    tensor2 = torch_randn(c(4))
    torch_matmul(tensor1, tensor2)
    #> torch_tensor
    #>  0.8498
    #> -0.1725
    #> -0.2413
    #> [ CPUFloatType{3} ]
    # batched matrix x broadcasted vector
    tensor1 = torch_randn(c(10, 3, 4))
    tensor2 = torch_randn(c(4))
    torch_matmul(tensor1, tensor2)
    #> torch_tensor
    #> -0.3051  0.8335  0.7817
    #> -1.7290 -0.1315 -1.0013
    #> -2.2799 -2.0186 -1.7212
    #> -2.0462  2.6530  2.2152
    #>  1.0726 -1.8564 -0.0805
    #>  2.5083  0.1464 -1.9015
    #>  0.1432  1.3745  0.8548
    #> -0.5524  0.7222  0.1316
    #> -1.5975  0.3861 -2.5685
    #> -3.1016  2.3700  0.1975
    #> [ CPUFloatType{10,3} ]
    # batched matrix x batched matrix
    tensor1 = torch_randn(c(10, 3, 4))
    tensor2 = torch_randn(c(10, 4, 5))
    torch_matmul(tensor1, tensor2)
    #> torch_tensor -#> (1,.,.) = -#> 2.7383 3.7776 1.1984 -0.0462 5.9024 -#> 0.1247 -2.9988 0.8401 -1.2394 -2.9948 -#> 3.0933 3.5992 3.5134 -3.3567 6.0899 -#> -#> (2,.,.) = -#> -0.9786 -0.3967 -0.3440 0.3824 -1.2635 -#> 0.4944 0.7149 0.5734 -1.1592 0.6260 -#> 1.3268 0.8352 1.9060 -3.7034 0.1600 -#> -#> (3,.,.) = -#> 0.4321 0.9564 1.6369 -0.1843 -0.3877 -#> -1.4475 -2.3336 -2.1501 -1.6377 3.0016 -#> -1.9856 0.2134 1.8436 1.6812 1.5892 -#> -#> (4,.,.) = -#> 0.7306 -0.1213 0.6505 -2.7818 -0.1171 -#> -1.4915 3.2909 0.5003 2.5146 1.3735 -#> 0.0837 -0.1117 0.8951 3.6226 1.1665 -#> -#> (5,.,.) = -#> -1.9671 0.1609 2.2518 0.7587 2.6455 -#> -0.5965 -1.3687 -2.4585 -1.6623 -3.6542 -#> 0.0899 1.6690 0.2659 -2.6994 2.8543 -#> -#> (6,.,.) = -#> -4.5552 0.7339 -2.3152 -5.9254 -1.6420 -#> -0.8628 0.0829 -0.6564 0.9928 -0.0438 -#> -4.1770 0.7927 -2.3129 -4.7196 -0.5457 -#> -#> (7,.,.) = -#> 1.3550 -0.9541 -1.7768 2.5931 1.6567 -#> 4.4144 1.4763 1.1692 -2.8514 -1.1695 -#> 4.2236 0.0365 -0.3523 1.0364 0.0855 -#> -#> (8,.,.) = -#> -3.3315 1.4609 -1.3792 -2.9510 -0.7443 -#> 1.3569 -2.2547 0.7054 -2.1291 5.2801 -#> -0.0102 1.3483 -0.3866 2.5474 -3.7250 -#> -#> (9,.,.) = -#> 0.8332 2.8945 -1.7215 1.0412 1.7159 -#> -1.0112 -5.9806 1.3106 3.5058 -2.5003 -#> 0.3979 0.5793 -0.9296 -2.5669 -3.1548 -#> -#> (10,.,.) = -#> 1.3155 -0.3605 0.2988 -0.0092 -0.7717 -#> 0.6723 1.2307 -1.8691 0.1663 -1.7657 -#> -1.2338 0.4575 0.0627 1.7217 2.0104 -#> [ CPUFloatType{10,3,5} ]
    # batched matrix x broadcasted matrix
    tensor1 = torch_randn(c(10, 3, 4))
    tensor2 = torch_randn(c(4, 5))
    torch_matmul(tensor1, tensor2)
    #> torch_tensor -#> (1,.,.) = -#> -5.1244 2.2971 -1.6275 2.7803 -4.4830 -#> 1.8873 -3.6560 1.6412 0.0507 0.1216 -#> -0.5433 1.9666 -1.4803 -0.5111 0.0638 -#> -#> (2,.,.) = -#> 0.0601 -4.0015 2.7077 1.7992 -1.7779 -#> -1.6433 1.4184 -1.3365 0.5739 -1.9782 -#> -3.4216 6.9532 -3.3050 -0.2836 0.0183 -#> -#> (3,.,.) = -#> -1.8677 -0.3711 0.3040 1.6625 -2.7921 -#> 3.2992 -0.5994 -1.6079 -2.2177 -1.6740 -#> -3.7891 4.2637 2.9312 2.2233 0.8338 -#> -#> (4,.,.) = -#> -2.1277 3.2238 0.0312 0.5080 1.3242 -#> 2.2424 -0.8814 0.2856 -1.2700 1.1283 -#> -1.0811 2.6808 -0.7240 -0.2231 1.0720 -#> -#> (5,.,.) = -#> -3.4333 2.0806 -0.9037 1.4697 -0.1231 -#> 2.3334 -3.4847 1.1505 -0.3664 -0.2367 -#> 1.2297 -2.4053 2.6418 0.6376 -1.1939 -#> -#> (6,.,.) = -#> -2.2066 0.7770 1.6664 1.7391 -0.3200 -#> -2.7109 -1.6160 -1.9334 1.4483 0.3787 -#> -1.2449 -0.9002 -1.0765 0.8913 -1.9713 -#> -#> (7,.,.) = -#> 0.2789 -1.0134 -0.5889 0.0808 -1.8249 -#> 0.6334 -0.8853 2.2413 0.6094 -1.5061 -#> -0.1358 0.5410 0.6620 0.1202 0.3539 -#> -#> (8,.,.) = -#> 0.5331 0.8902 1.7452 -0.4616 3.8193 -#> 0.4330 2.0916 -0.6593 -0.9295 0.1894 -#> 0.8105 -0.3216 0.7608 -0.4534 2.0104 -#> -#> (9,.,.) = -#> 2.2947 1.8882 2.8289 -1.2676 2.0636 -#> -0.6974 -2.0088 -1.8360 0.1831 1.3406 -#> 4.7011 -3.7311 0.0230 -1.9564 -1.7259 -#> -#> (10,.,.) = -#> 2.7739 -3.8386 -1.8082 -1.4422 -0.0495 -#> 1.3869 -3.8928 0.8670 0.1969 0.0855 -#> -3.0783 1.1996 -2.8824 0.9550 -0.9141 -#> [ CPUFloatType{10,3,5} ]
    # }

diff --git a/docs/reference/torch_matrix_power.html b/docs/reference/torch_matrix_power.html
deleted file mode 100644

    Matrix_power — torch_matrix_power

    Arguments

    input: (Tensor) the input tensor.
    n: (int) the power to raise the matrix to.

    matrix_power(input, n) -> Tensor

    Returns the matrix raised to the power n for square matrices. For a batch of matrices, each individual matrix is raised to the power n.

    If n is negative, then the inverse of the matrix (if invertible) is raised to the power n. For a batch of matrices, the batched inverse (if invertible) is raised to the power n. If n is 0, then an identity matrix is returned.

    Examples

    # \dontrun{
    a = torch_randn(c(2, 2, 2))
    a
    #> torch_tensor
    #> (1,.,.) =
    #>   0.0871  0.0197
    #>   0.3185  0.4297
    #>
    #> (2,.,.) =
    #>  -0.7818 -0.6402
    #>   0.2400 -0.4367
    #> [ CPUFloatType{2,2,2} ]
    torch_matrix_power(a, 3)
    #> torch_tensor
    #> (1,.,.) =
    #>  0.01 *
    #>   0.4446  0.4644
    #>   7.5134  8.5264
    #>
    #> (2,.,.) =
    #>  -0.1705 -0.6335
    #>   0.2375  0.1711
    #> [ CPUFloatType{2,2,2} ]
    # }

diff --git a/docs/reference/torch_matrix_rank.html b/docs/reference/torch_matrix_rank.html
deleted file mode 100644

    Matrix_rank — torch_matrix_rank

    Arguments

    input: (Tensor) the input 2-D tensor.
    tol: (float, optional) the tolerance value. Default: NULL.
    symmetric: (bool, optional) indicates whether input is symmetric. Default: FALSE.

    matrix_rank(input, tol=None, symmetric=False) -> Tensor

    Returns the numerical rank of a 2-D tensor. The matrix rank is computed using SVD by default. If symmetric is TRUE, then input is assumed to be symmetric, and the computation of the rank is done by obtaining the eigenvalues.

    tol is the threshold below which the singular values (or the eigenvalues when symmetric is TRUE) are considered to be 0. If tol is not specified, tol is set to S.max() * max(S.size()) * eps, where S is the singular values (or the eigenvalues when symmetric is TRUE), and eps is the epsilon value for the datatype of input.

Examples

# \dontrun{
a = torch_eye(10)
torch_matrix_rank(a)
#> torch_tensor
#> 10
#> [ CPULongType{} ]
# }

diff --git a/docs/reference/torch_max.html b/docs/reference/torch_max.html deleted file mode 100644

    Max

Arguments
    input

    (Tensor) the input tensor.

    dim

    (int) the dimension to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not. Default: False.

    out

    (tuple, optional) the result tuple of two output tensors (max, max_indices)

    other

    (Tensor) the second input tensor

Note

When the shapes do not match, the shape of the returned output tensor
follows the broadcasting rules.

max(input) -> Tensor

Returns the maximum value of all elements in the input tensor.

max(input, dim, keepdim=False, out=None) -> (Tensor, LongTensor)

Returns a namedtuple (values, indices) where values is the maximum
value of each row of the input tensor in the given dimension
dim. And indices is the index location of each maximum value found
(argmax).

Warning

indices does not necessarily contain the first occurrence of each
maximal value found, unless it is unique.
The exact implementation details are device-specific.
Do not expect the same result when run on CPU and GPU in general.

If keepdim is True, the output tensors are of the same size
as input except in the dimension dim where they are of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting
in the output tensors having 1 fewer dimension than input.

max(input, other, out=None) -> Tensor

Each element of the tensor input is compared with the corresponding
element of the tensor other and an element-wise maximum is taken.

The shapes of input and other don't need to match,
but they must be broadcastable.

$$
\mbox{out}_i = \max(\mbox{tensor}_i, \mbox{other}_i)
$$

Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
#> torch_tensor
#> -1.7404  0.4095  0.0815
#> [ CPUFloatType{1,3} ]
torch_max(a)
#> torch_tensor
#> 0.409483
#> [ CPUFloatType{} ]

a = torch_randn(c(4, 4))
a
#> torch_tensor
#> -0.4374  0.4921  0.5690 -0.5727
#> -0.7344  1.4273  0.3648  1.0731
#>  0.4967  0.2687  0.5694 -0.4233
#>  0.2623  0.0037 -1.2246 -0.3742
#> [ CPUFloatType{4,4} ]
torch_max(a, dim = 1)
#> [[1]]
#> torch_tensor
#>  0.4967
#>  1.4273
#>  0.5694
#>  1.0731
#> [ CPUFloatType{4} ]
#>
#> [[2]]
#> torch_tensor
#>  3
#>  2
#>  3
#>  2
#> [ CPULongType{4} ]
#>

a = torch_randn(c(4))
a
#> torch_tensor
#> -0.0640
#> -0.2356
#> -0.5395
#>  1.8484
#> [ CPUFloatType{4} ]
b = torch_randn(c(4))
b
#> torch_tensor
#>  0.4095
#> -0.6778
#> -0.0908
#> -0.2021
#> [ CPUFloatType{4} ]
torch_max(a, other = b)
#> torch_tensor
#>  0.4095
#> -0.2356
#> -0.0908
#>  1.8484
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_mean.html b/docs/reference/torch_mean.html deleted file mode 100644

    Mean

Arguments
    input

    (Tensor) the input tensor.

    dim

    (int or tuple of ints) the dimension or dimensions to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

    out

    (Tensor, optional) the output tensor.

mean(input) -> Tensor

Returns the mean value of all elements in the input tensor.

mean(input, dim, keepdim=False, out=None) -> Tensor

Returns the mean value of each row of the input tensor in the given
dimension dim. If dim is a list of dimensions,
reduce over all of them.

If keepdim is True, the output tensor is of the same size
as input except in the dimension(s) dim where it is of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting in the
output tensor having 1 (or len(dim)) fewer dimension(s).

Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
#> torch_tensor
#> -0.0395  1.7826  1.2161
#> [ CPUFloatType{1,3} ]
torch_mean(a)
#> torch_tensor
#> 0.986383
#> [ CPUFloatType{} ]

a = torch_randn(c(4, 4))
a
#> torch_tensor
#>  0.0216 -1.8383 -0.8037 -1.2803
#>  1.6988 -1.2344 -0.5559  0.7407
#> -1.5668  0.8250 -0.0814  0.6922
#> -0.7400 -0.0428  0.7179 -0.2121
#> [ CPUFloatType{4,4} ]
torch_mean(a, 1)
#> torch_tensor
#> -0.1466
#> -0.5726
#> -0.1808
#> -0.0149
#> [ CPUFloatType{4} ]
torch_mean(a, 1, TRUE)
#> torch_tensor
#> -0.1466 -0.5726 -0.1808 -0.0149
#> [ CPUFloatType{1,4} ]
# }

diff --git a/docs/reference/torch_median.html b/docs/reference/torch_median.html deleted file mode 100644

    Median

Arguments
    input

    (Tensor) the input tensor.

    dim

    (int) the dimension to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

    out

(tuple, optional) the result tuple of two output tensors (values, indices)

median(input) -> Tensor

Returns the median value of all elements in the input tensor.

median(input, dim=-1, keepdim=False, out=None) -> (Tensor, LongTensor)

Returns a namedtuple (values, indices) where values is the median
value of each row of the input tensor in the given dimension
dim. And indices is the index location of each median value found.

By default, dim is the last dimension of the input tensor.

If keepdim is True, the output tensors are of the same size
as input except in the dimension dim where they are of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting in
the output tensors having 1 fewer dimension than input.

Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
#> torch_tensor
#> -1.1294  0.8996  0.0937
#> [ CPUFloatType{1,3} ]
torch_median(a)
#> torch_tensor
#> 0.0937234
#> [ CPUFloatType{} ]

a = torch_randn(c(4, 5))
a
#> torch_tensor
#>  0.2865 -0.8226  0.6805  0.3636 -0.6890
#>  0.8853  0.3427  0.8220 -2.2562 -1.8976
#> -1.3180  0.3580  1.1346 -0.5496 -0.2493
#> -1.1359  0.0354 -0.3702 -0.0126  1.0450
#> [ CPUFloatType{4,5} ]
torch_median(a, 1)
#> [[1]]
#> torch_tensor
#> -1.1359
#>  0.0354
#>  0.6805
#> -0.5496
#> -0.6890
#> [ CPUFloatType{5} ]
#>
#> [[2]]
#> torch_tensor
#>  3
#>  3
#>  0
#>  2
#>  0
#> [ CPULongType{5} ]
#>
# }

diff --git a/docs/reference/torch_memory_format.html b/docs/reference/torch_memory_format.html deleted file mode 100644

Memory format

Returns the corresponding memory format.

torch_contiguous_format()
torch_preserve_format()
torch_channels_last_format()
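For illustration, a minimal usage sketch, assuming (as the torch_ones_like page below documents) that the *_like creation functions accept a memory_format argument:

# \dontrun{
x = torch_randn(c(2, 2))
# request a contiguous layout for the newly allocated result (assumed API)
torch_ones_like(x, memory_format = torch_contiguous_format())
# }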
diff --git a/docs/reference/torch_meshgrid.html b/docs/reference/torch_meshgrid.html deleted file mode 100644
    Meshgrid

Arguments
    tensors

(list of Tensor) list of scalars or 1-dimensional tensors. Scalars will be treated as tensors of size (1,).

meshgrid(tensors) -> list of Tensors

Take \(N\) tensors, each of which can be either scalar or 1-dimensional
vector, and create \(N\) N-dimensional grids, where the \(i\) th grid is defined by
expanding the \(i\) th input over dimensions defined by other inputs.

Examples

# \dontrun{
x = torch_tensor(c(1, 2, 3))
y = torch_tensor(c(4, 5, 6))
out = torch_meshgrid(list(x, y))
out
#> [[1]]
#> torch_tensor
#>  1  1  1
#>  2  2  2
#>  3  3  3
#> [ CPUFloatType{3,3} ]
#>
#> [[2]]
#> torch_tensor
#>  4  5  6
#>  4  5  6
#>  4  5  6
#> [ CPUFloatType{3,3} ]
#>
# }

diff --git a/docs/reference/torch_min.html b/docs/reference/torch_min.html deleted file mode 100644

    Min

Arguments
    input

    (Tensor) the input tensor.

    dim

    (int) the dimension to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

    out

    (tuple, optional) the tuple of two output tensors (min, min_indices)

    other

    (Tensor) the second input tensor

Note

When the shapes do not match, the shape of the returned output tensor
follows the broadcasting rules.

min(input) -> Tensor

Returns the minimum value of all elements in the input tensor.

min(input, dim, keepdim=False, out=None) -> (Tensor, LongTensor)

Returns a namedtuple (values, indices) where values is the minimum
value of each row of the input tensor in the given dimension
dim. And indices is the index location of each minimum value found
(argmin).

Warning

indices does not necessarily contain the first occurrence of each
minimal value found, unless it is unique.
The exact implementation details are device-specific.
Do not expect the same result when run on CPU and GPU in general.

If keepdim is True, the output tensors are of the same size as
input except in the dimension dim where they are of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting in
the output tensors having 1 fewer dimension than input.

min(input, other, out=None) -> Tensor

Each element of the tensor input is compared with the corresponding
element of the tensor other and an element-wise minimum is taken.
The resulting tensor is returned.

The shapes of input and other don't need to match,
but they must be broadcastable.

$$
\mbox{out}_i = \min(\mbox{tensor}_i, \mbox{other}_i)
$$

Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
#> torch_tensor
#> -1.0189  1.0439  1.3884
#> [ CPUFloatType{1,3} ]
torch_min(a)
#> torch_tensor
#> -1.01891
#> [ CPUFloatType{} ]

a = torch_randn(c(4, 4))
a
#> torch_tensor
#>  0.9733  2.4571  1.7912 -1.4290
#>  0.5607 -0.5847 -0.4779 -0.7823
#> -0.7391  0.6672 -0.9647  0.1703
#> -0.5473 -0.2047 -0.1148  1.4254
#> [ CPUFloatType{4,4} ]
torch_min(a, dim = 1)
#> [[1]]
#> torch_tensor
#> -0.7391
#> -0.5847
#> -0.9647
#> -1.4290
#> [ CPUFloatType{4} ]
#>
#> [[2]]
#> torch_tensor
#>  3
#>  2
#>  3
#>  1
#> [ CPULongType{4} ]
#>

a = torch_randn(c(4))
a
#> torch_tensor
#>  0.2877
#> -1.1317
#> -1.0846
#>  0.0735
#> [ CPUFloatType{4} ]
b = torch_randn(c(4))
b
#> torch_tensor
#> -1.2118
#> -0.7290
#> -0.8948
#>  0.5896
#> [ CPUFloatType{4} ]
torch_min(a, other = b)
#> torch_tensor
#> -1.2118
#> -1.1317
#> -1.0846
#>  0.0735
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_mm.html b/docs/reference/torch_mm.html deleted file mode 100644

    Mm

Arguments
    input

    (Tensor) the first matrix to be multiplied

    mat2

    (Tensor) the second matrix to be multiplied

    out

    (Tensor, optional) the output tensor.

Note

This function does not broadcast.
For broadcasting matrix products, see torch_matmul.

mm(input, mat2, out=None) -> Tensor

Performs a matrix multiplication of the matrices input and mat2.

If input is a \((n \times m)\) tensor, mat2 is a
\((m \times p)\) tensor, out will be a \((n \times p)\) tensor.

Examples

# \dontrun{
mat1 = torch_randn(c(2, 3))
mat2 = torch_randn(c(3, 3))
torch_mm(mat1, mat2)
#> torch_tensor
#> -4.2557 -0.6099 -0.8782
#> -3.0933  0.3112 -0.9391
#> [ CPUFloatType{2,3} ]
# }

diff --git a/docs/reference/torch_mode.html b/docs/reference/torch_mode.html deleted file mode 100644

    Mode

Arguments
    input

    (Tensor) the input tensor.

    dim

    (int) the dimension to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

    out

    (tuple, optional) the result tuple of two output tensors (values, indices)

Note

This function is not defined for torch_cuda.Tensor yet.

mode(input, dim=-1, keepdim=False, out=None) -> (Tensor, LongTensor)

Returns a namedtuple (values, indices) where values is the mode
value of each row of the input tensor in the given dimension
dim, i.e. a value which appears most often
in that row, and indices is the index location of each mode value found.

By default, dim is the last dimension of the input tensor.

If keepdim is True, the output tensors are of the same size as
input except in the dimension dim where they are of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting
in the output tensors having 1 fewer dimension than input.

Examples

# \dontrun{
a = torch_randint(0, 50, size = list(5))
a
#> torch_tensor
#> 18
#>  0
#>  7
#>  4
#> 19
#> [ CPUFloatType{5} ]
torch_mode(a, 1)
#> [[1]]
#> torch_tensor
#> 0
#> [ CPUFloatType{} ]
#>
#> [[2]]
#> torch_tensor
#> 1
#> [ CPULongType{} ]
#>
# }

diff --git a/docs/reference/torch_mul.html b/docs/reference/torch_mul.html deleted file mode 100644

    Mul

Arguments
    input

(Tensor) the input tensor.

    value

    (Number) the number to be multiplied to each element of input

    out

(Tensor, optional) the output tensor.

    input

    (Tensor) the first multiplicand tensor

    other

    (Tensor) the second multiplicand tensor

    out

    (Tensor, optional) the output tensor.

mul(input, other, out=None)

Multiplies each element of the input input with the scalar
other and returns a new resulting tensor.

$$
\mbox{out}_i = \mbox{other} \times \mbox{input}_i
$$

If input is of type FloatTensor or DoubleTensor, other
should be a real number, otherwise it should be an integer.

Each element of the tensor input is multiplied by the corresponding
element of the Tensor other. The resulting tensor is returned.

The shapes of input and other must be
broadcastable.

$$
\mbox{out}_i = \mbox{input}_i \times \mbox{other}_i
$$

Examples

# \dontrun{
a = torch_randn(c(3))
a
#> torch_tensor
#> 0.7353
#> 0.3087
#> 0.8232
#> [ CPUFloatType{3} ]
torch_mul(a, 100)
#> torch_tensor
#> 73.5282
#> 30.8688
#> 82.3200
#> [ CPUFloatType{3} ]

a = torch_randn(c(4, 1))
a
#> torch_tensor
#> 0.1683
#> 0.6845
#> 1.4773
#> 1.1179
#> [ CPUFloatType{4,1} ]
b = torch_randn(c(1, 4))
b
#> torch_tensor
#> -1.4203  0.6324 -0.8087 -0.5061
#> [ CPUFloatType{1,4} ]
torch_mul(a, b)
#> torch_tensor
#> -0.2390  0.1064 -0.1361 -0.0852
#> -0.9722  0.4329 -0.5535 -0.3464
#> -2.0981  0.9343 -1.1946 -0.7476
#> -1.5877  0.7070 -0.9040 -0.5657
#> [ CPUFloatType{4,4} ]
# }

diff --git a/docs/reference/torch_multinomial.html b/docs/reference/torch_multinomial.html deleted file mode 100644

    Multinomial

Arguments
    input

    (Tensor) the input tensor containing probabilities

    num_samples

    (int) number of samples to draw

    replacement

    (bool, optional) whether to draw with replacement or not

    generator

    (torch.Generator, optional) a pseudorandom number generator for sampling

    out

    (Tensor, optional) the output tensor.

Note

The rows of input do not need to sum to one (in which case we use
the values as weights), but must be non-negative, finite and have
a non-zero sum.

Indices are ordered from left to right according to when each was sampled
(first samples are placed in first column).

If input is a vector, out is a vector of size num_samples.

If input is a matrix with m rows, out is a matrix of shape
\((m \times \mbox{num\_samples})\).

If replacement is True, samples are drawn with replacement.

If not, they are drawn without replacement, which means that when a
sample index is drawn for a row, it cannot be drawn again for that row.

When drawn without replacement, num_samples must be lower than the
number of non-zero elements in input (or the min number of non-zero
elements in each row of input if it is a matrix).

multinomial(input, num_samples, replacement=False, *, generator=None, out=None) -> LongTensor

Returns a tensor where each row contains num_samples indices sampled
from the multinomial probability distribution located in the corresponding row
of tensor input.

Examples

# \dontrun{
weights = torch_tensor(c(0, 10, 3, 0), dtype=torch_float()) # create a tensor of weights
torch_multinomial(weights, 2)
#> torch_tensor
#> 1
#> 2
#> [ CPULongType{2} ]
torch_multinomial(weights, 4, replacement=TRUE)
#> torch_tensor
#> 1
#> 2
#> 1
#> 2
#> [ CPULongType{4} ]
# }

diff --git a/docs/reference/torch_mv.html b/docs/reference/torch_mv.html deleted file mode 100644

    Mv

Arguments
    input

    (Tensor) matrix to be multiplied

    vec

    (Tensor) vector to be multiplied

    out

    (Tensor, optional) the output tensor.

Note

This function does not broadcast.

mv(input, vec, out=None) -> Tensor

Performs a matrix-vector product of the matrix input and the vector
vec.

If input is a \((n \times m)\) tensor, vec is a 1-D tensor of
size \(m\), out will be 1-D of size \(n\).

Examples

# \dontrun{
mat = torch_randn(c(2, 3))
vec = torch_randn(c(3))
torch_mv(mat, vec)
#> torch_tensor
#> -0.9277
#>  1.8568
#> [ CPUFloatType{2} ]
# }

diff --git a/docs/reference/torch_mvlgamma.html b/docs/reference/torch_mvlgamma.html deleted file mode 100644

    Mvlgamma

Arguments
    input

    (Tensor) the tensor to compute the multivariate log-gamma function

    p

    (int) the number of dimensions

mvlgamma(input, p) -> Tensor

Computes the multivariate log-gamma function
(see https://en.wikipedia.org/wiki/Multivariate_gamma_function) with dimension
\(p\) element-wise, given by

$$
\log(\Gamma_{p}(a)) = C + \displaystyle \sum_{i=1}^{p} \log\left(\Gamma\left(a - \frac{i - 1}{2}\right)\right)
$$

where \(C = \log(\pi) \times \frac{p (p - 1)}{4}\) and \(\Gamma(\cdot)\) is the Gamma function.

All elements must be greater than \(\frac{p - 1}{2}\), otherwise an error would be thrown.

Examples

# \dontrun{
a = torch_empty(c(2, 3))$uniform_(1, 2)
a
#> torch_tensor
#> 1.2019  1.8425  1.1256
#> 1.9082  1.8734  1.5464
#> [ CPUFloatType{2,3} ]
torch_mvlgamma(a, 2)
#> torch_tensor
#> 0.7450  0.3997  0.8720
#> 0.4162  0.4065  0.4292
#> [ CPUFloatType{2,3} ]
# }

diff --git a/docs/reference/torch_narrow.html b/docs/reference/torch_narrow.html deleted file mode 100644

    Narrow

Arguments
    input

    (Tensor) the tensor to narrow

    dim

    (int) the dimension along which to narrow

    start

(int) the index where the narrowed dimension starts

    length

(int) the length of the narrowed dimension

narrow(input, dim, start, length) -> Tensor

Returns a new tensor that is a narrowed version of the input tensor. The
dimension dim spans from start to start + length in the returned tensor. The
returned tensor and the input tensor share the same underlying storage.

Examples

# \dontrun{
x = torch_tensor(matrix(c(1:9), ncol = 3, byrow = TRUE))
torch_narrow(x, 1, torch_tensor(0L)$sum(dim = 1), 2)
#> torch_tensor
#> 1  2  3
#> 4  5  6
#> [ CPUIntType{2,3} ]
torch_narrow(x, 2, torch_tensor(1L)$sum(dim = 1), 2)
#> torch_tensor
#> 2  3
#> 5  6
#> 8  9
#> [ CPUIntType{3,2} ]
# }

diff --git a/docs/reference/torch_ne.html b/docs/reference/torch_ne.html deleted file mode 100644

    Ne

Arguments
    input

    (Tensor) the tensor to compare

    other

    (Tensor or float) the tensor or value to compare

    out

    (Tensor, optional) the output tensor that must be a BoolTensor

ne(input, other, out=None) -> Tensor

Computes \(input \neq other\) element-wise.

The second argument can be a number or a tensor whose shape is
broadcastable with the first argument.

Examples

# \dontrun{
torch_ne(torch_tensor(matrix(1:4, ncol = 2, byrow=TRUE)),
         torch_tensor(matrix(rep(c(1,4), each = 2), ncol = 2, byrow=TRUE)))
#> torch_tensor
#> 0  1
#> 1  0
#> [ CPUBoolType{2,2} ]
# }

diff --git a/docs/reference/torch_neg.html b/docs/reference/torch_neg.html deleted file mode 100644

    Neg

Arguments
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

neg(input, out=None) -> Tensor

Returns a new tensor with the negative of the elements of input.

$$
\mbox{out} = -1 \times \mbox{input}
$$

Examples

# \dontrun{
a = torch_randn(c(5))
a
#> torch_tensor
#>  0.3160
#> -0.4731
#>  0.1641
#>  0.6355
#>  0.2480
#> [ CPUFloatType{5} ]
torch_neg(a)
#> torch_tensor
#> -0.3160
#>  0.4731
#> -0.1641
#> -0.6355
#> -0.2480
#> [ CPUFloatType{5} ]
# }

diff --git a/docs/reference/torch_nonzero.html b/docs/reference/torch_nonzero.html deleted file mode 100644

    Nonzero

Arguments
    input

    (Tensor) the input tensor.

    out

    (LongTensor, optional) the output tensor containing indices

Note

torch_nonzero(..., as_tuple=False) (default) returns a
2-D tensor where each row is the index for a nonzero value.

torch_nonzero(..., as_tuple=True) returns a tuple of 1-D
index tensors, allowing for advanced indexing, so x[x.nonzero(as_tuple=True)]
gives all nonzero values of tensor x. Of the returned tuple, each index tensor
contains nonzero indices for a certain dimension.

See below for more details on the two behaviors.

nonzero(input, *, out=None, as_tuple=False) -> LongTensor or tuple of LongTensors

When as_tuple is False (default):

Returns a tensor containing the indices of all non-zero elements of
input. Each row in the result contains the indices of a non-zero
element in input. The result is sorted lexicographically, with
the last index changing the fastest (C-style).

If input has \(n\) dimensions, then the resulting indices tensor
out is of size \((z \times n)\), where \(z\) is the total number of
non-zero elements in the input tensor.

When as_tuple is True:

Returns a tuple of 1-D tensors, one for each dimension in input,
each containing the indices (in that dimension) of all non-zero elements of
input.

If input has \(n\) dimensions, then the resulting tuple contains \(n\)
tensors of size \(z\), where \(z\) is the total number of
non-zero elements in the input tensor.

As a special case, when input has zero dimensions and a nonzero scalar
value, it is treated as a one-dimensional tensor with one element.

Examples

# \dontrun{
torch_nonzero(torch_tensor(c(1, 1, 1, 0, 1)))
#> torch_tensor
#> 0
#> 1
#> 2
#> 4
#> [ CPULongType{4,1} ]
# }

diff --git a/docs/reference/torch_norm.html b/docs/reference/torch_norm.html deleted file mode 100644

    Norm

Arguments
    input

    (Tensor) the input tensor

    p

(int, float, inf, -inf, 'fro', 'nuc', optional) the order of norm. Default: 'fro'. The following norms can be calculated:

  ord    matrix norm                    vector norm
  -----  -----------------------------  --------------------------
  None   Frobenius norm                 2-norm
  'fro'  Frobenius norm                 --
  'nuc'  nuclear norm                   --
  Other  as vec norm when dim is None   sum(abs(x)^ord)^(1./ord)

    dim

    (int, 2-tuple of ints, 2-list of ints, optional) If it is an int, vector norm will be calculated, if it is 2-tuple of ints, matrix norm will be calculated. If the value is None, matrix norm will be calculated when the input tensor only has two dimensions, vector norm will be calculated when the input tensor only has one dimension. If the input tensor has more than two dimensions, the vector norm will be applied to last dimension.

    keepdim

    (bool, optional) whether the output tensors have dim retained or not. Ignored if dim = None and out = None. Default: False

    out

    (Tensor, optional) the output tensor. Ignored if dim = None and out = None.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. If specified, the input tensor is casted to 'dtype' while performing the operation. Default: None.

norm(input, p='fro', dim=None, keepdim=False, out=None, dtype=None) -> Tensor

Returns the matrix norm or vector norm of a given tensor.

Examples

# \dontrun{
a = torch_arange(0, 9, dtype = torch_float())
b = a$reshape(list(3, 3))
torch_norm(a)
#> torch_tensor
#> 14.2829
#> [ CPUFloatType{} ]
torch_norm(b)
#> torch_tensor
#> 14.2829
#> [ CPUFloatType{} ]
torch_norm(a, Inf)
#> torch_tensor
#> 8
#> [ CPUFloatType{} ]
torch_norm(b, Inf)
#> torch_tensor
#> 8
#> [ CPUFloatType{} ]
# }

diff --git a/docs/reference/torch_normal.html b/docs/reference/torch_normal.html deleted file mode 100644

    Normal

Arguments
    mean

    (Tensor) the tensor of per-element means

    std

    (Tensor) the tensor of per-element standard deviations

    generator

    (torch.Generator, optional) a pseudorandom number generator for sampling

    out

    (Tensor, optional) the output tensor.

    size

    (int...) a sequence of integers defining the shape of the output tensor.

Note

When the shapes do not match, the shape of mean
is used as the shape for the returned output tensor.

normal(mean, std, *, generator=None, out=None) -> Tensor

Returns a tensor of random numbers drawn from separate normal distributions
whose mean and standard deviation are given.

The mean is a tensor with the mean of
each output element's normal distribution.

The std is a tensor with the standard deviation of
each output element's normal distribution.

The shapes of mean and std don't need to match, but the
total number of elements in each tensor needs to be the same.

normal(mean=0.0, std, out=None) -> Tensor

Similar to the function above, but the means are shared among all drawn
elements.

normal(mean, std=1.0, out=None) -> Tensor

Similar to the function above, but the standard deviations are shared among
all drawn elements.

normal(mean, std, size, *, out=None) -> Tensor

Similar to the function above, but the means and standard deviations are shared
among all drawn elements. The resulting tensor has size given by size.

    Examples
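A small sketch of the overloads documented above (argument names as documented; the draws are random, so no outputs are shown):

# \dontrun{
# per-element means and standard deviations
torch_normal(mean = torch_arange(1., 5.), std = torch_tensor(c(1, 0.5, 0.25, 0.1)))
# shared mean with per-element standard deviations
torch_normal(mean = 0.5, std = torch_arange(1., 6.))
# shared mean and standard deviation with an explicit output shape
torch_normal(mean = 2, std = 3, size = c(1, 4))
# }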

diff --git a/docs/reference/torch_ones.html b/docs/reference/torch_ones.html deleted file mode 100644

    Ones

Arguments
    size

    (int...) a sequence of integers defining the shape of the output tensor. Can be a variable number of arguments or a collection like a list or tuple.

    out

    (Tensor, optional) the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

    layout

    (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

ones(*size, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Returns a tensor filled with the scalar value 1, with the shape defined
by the variable argument size.

Examples

# \dontrun{
torch_ones(c(2, 3))
#> torch_tensor
#> 1  1  1
#> 1  1  1
#> [ CPUFloatType{2,3} ]
torch_ones(c(5))
#> torch_tensor
#> 1
#> 1
#> 1
#> 1
#> 1
#> [ CPUFloatType{5} ]
# }

diff --git a/docs/reference/torch_ones_like.html b/docs/reference/torch_ones_like.html deleted file mode 100644

    Ones_like

Arguments
    input

    (Tensor) the size of input will determine size of the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned Tensor. Default: if None, defaults to the dtype of input.

    layout

    (torch.layout, optional) the desired layout of returned tensor. Default: if None, defaults to the layout of input.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, defaults to the device of input.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    memory_format

    (torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_preserve_format.

ones_like(input, dtype=None, layout=None, device=None, requires_grad=False, memory_format=torch.preserve_format) -> Tensor

Returns a tensor filled with the scalar value 1, with the same size as
input. torch_ones_like(input) is equivalent to
torch_ones(input.size(), dtype=input.dtype, layout=input.layout, device=input.device).

Warning

As of 0.4, this function does not support an out keyword. As an alternative,
the old torch_ones_like(input, out=output) is equivalent to
torch_ones(input.size(), out=output).

Examples

# \dontrun{
input = torch_empty(c(2, 3))
torch_ones_like(input)
#> torch_tensor
#> 1  1  1
#> 1  1  1
#> [ CPUFloatType{2,3} ]
# }

diff --git a/docs/reference/torch_orgqr.html b/docs/reference/torch_orgqr.html deleted file mode 100644

    Orgqr

Arguments
    input

    (Tensor) the a from torch_geqrf.

    input2

    (Tensor) the tau from torch_geqrf.

orgqr(input, input2) -> Tensor

Computes the orthogonal matrix Q of a QR factorization, from the (input, input2)
tuple returned by torch_geqrf.

This directly calls the underlying LAPACK function ?orgqr.
See the LAPACK documentation for orgqr for further details.
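Examples

A minimal sketch of recovering Q, assuming a random square input (torch_geqrf returns the (a, tau) pair expected here):

# \dontrun{
a = torch_randn(c(4, 4))
geqrf = torch_geqrf(a)
q = torch_orgqr(geqrf[[1]], geqrf[[2]])
# q is orthogonal, so t(q) %*% q should be close to the identity
torch_mm(q$t(), q)$round()
# }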

diff --git a/docs/reference/torch_ormqr.html b/docs/reference/torch_ormqr.html deleted file mode 100644

    Ormqr

Arguments
    input

    (Tensor) the a from torch_geqrf.

    input2

    (Tensor) the tau from torch_geqrf.

    input3

    (Tensor) the matrix to be multiplied.

ormqr(input, input2, input3, left=True, transpose=False) -> Tensor

Multiplies mat (given by input3) by the orthogonal Q matrix of the QR factorization
formed by torch_geqrf that is represented by (a, tau) (given by (input, input2)).

This directly calls the underlying LAPACK function ?ormqr.
See the LAPACK documentation for ormqr for further details.
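Examples

A minimal sketch, assuming the same torch_geqrf output as in torch_orgqr above; this applies Q to a matrix without materializing Q:

# \dontrun{
a = torch_randn(c(3, 3))
geqrf = torch_geqrf(a)
c = torch_randn(c(3, 2))
torch_ormqr(geqrf[[1]], geqrf[[2]], c)  # computes Q %*% c
# }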

diff --git a/docs/reference/torch_pdist.html b/docs/reference/torch_pdist.html deleted file mode 100644

    Pdist

Arguments
    input

(Tensor) input tensor of shape \(N \times M\).

    p

(float) p value for the p-norm distance to calculate between each vector pair \(\in [0, \infty]\).

pdist(input, p=2) -> Tensor

Computes the p-norm distance between every pair of row vectors in the input.
This is identical to the upper triangular portion, excluding the diagonal, of
torch_norm(input[:, None] - input, dim=2, p=p). This function will be faster
if the rows are contiguous.

If input has shape \(N \times M\) then the output will have shape
\(\frac{1}{2} N (N - 1)\).

This function is equivalent to scipy.spatial.distance.pdist(input, 'minkowski', p=p) if \(p \in (0, \infty)\). When \(p = 0\) it is
equivalent to scipy.spatial.distance.pdist(input, 'hamming') * M.
When \(p = \infty\), the closest scipy function is
scipy.spatial.distance.pdist(xn, lambda x, y: np.abs(x - y).max()).
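Examples

A small sketch of the shape contract described above (values are random, so only the output length matters):

# \dontrun{
input = torch_randn(c(4, 3))  # N = 4 row vectors of dimension M = 3
torch_pdist(input, p = 2)     # a tensor of N * (N - 1) / 2 = 6 pairwise distances
# }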

diff --git a/docs/reference/torch_pinverse.html b/docs/reference/torch_pinverse.html deleted file mode 100644

    Pinverse

Arguments
    input

    (Tensor) The input tensor of size \((*, m, n)\) where \(*\) is zero or more batch dimensions

    rcond

    (float) A floating point value to determine the cutoff for small singular values. Default: 1e-15

Note

This method is implemented using the Singular Value Decomposition.

The pseudo-inverse is not necessarily a continuous function in the elements of the matrix [1].
Therefore, derivatives do not always exist, and exist for a constant rank only [2].
However, this method is backprop-able due to the implementation by using SVD results, and
could be unstable. Double-backward will also be unstable due to the usage of SVD internally.
See torch_svd for more details.

pinverse(input, rcond=1e-15) -> Tensor

Calculates the pseudo-inverse (also known as the Moore-Penrose inverse) of a 2D tensor.
Please look at the Moore-Penrose inverse article for more details.

Examples

# \dontrun{
input = torch_randn(c(3, 5))
input
#> torch_tensor
#>  0.0625  0.0470 -0.6356  1.0166 -0.2998
#>  0.2736 -0.5027  2.6768  0.3714  0.5533
#> -0.5951  1.2603  0.2886  0.6099 -1.3339
#> [ CPUFloatType{3,5} ]
torch_pinverse(input)
#> torch_tensor
#>  0.1974  0.0598 -0.1754
#> -0.2446 -0.0864  0.3409
#> -0.1949  0.3106  0.1489
#>  0.8721  0.2108 -0.0062
#>  0.0375  0.0552 -0.3200
#> [ CPUFloatType{5,3} ]

# Batched pinverse example
a = torch_randn(c(2,6,3))
b = torch_pinverse(a)
torch_matmul(b, a)
#> torch_tensor
#> (1,.,.) =
#>   1.0000e+00  5.2154e-08 -1.0431e-07
#>   2.9802e-08  1.0000e+00  3.7253e-08
#>   2.9802e-08 -6.7055e-08  1.0000e+00
#>
#> (2,.,.) =
#>   1.0000e+00  2.3283e-08  1.9372e-07
#>   2.9802e-08  1.0000e+00 -5.9605e-07
#>   2.9802e-08  1.5087e-07  1.0000e+00
#> [ CPUFloatType{2,3,3} ]
# }

diff --git a/docs/reference/torch_pixel_shuffle.html b/docs/reference/torch_pixel_shuffle.html deleted file mode 100644

    Pixel_shuffle

Arguments
    input

    (Tensor) the input tensor

    upscale_factor

    (int) factor to increase spatial resolution by

pixel_shuffle(input, upscale_factor) -> Tensor

Rearranges elements in a tensor of shape \((*, C \times r^2, H, W)\) to a
tensor of shape \((*, C, H \times r, W \times r)\).

See nnf_pixel_shuffle for details.

Examples

# \dontrun{
input = torch_randn(c(1, 9, 4, 4))
output = nnf_pixel_shuffle(input, 3)
print(output$size())
#> [1]  1  1 12 12
# }

diff --git a/docs/reference/torch_poisson.html b/docs/reference/torch_poisson.html deleted file mode 100644

    Poisson

Arguments
    input

    (Tensor) the input tensor containing the rates of the Poisson distribution

    generator

    (torch.Generator, optional) a pseudorandom number generator for sampling

poisson(input, *, generator=None) -> Tensor

Returns a tensor of the same size as input with each element
sampled from a Poisson distribution with rate parameter given by the corresponding
element in input, i.e.,

$$
\mbox{out}_i \sim \mbox{Poisson}(\mbox{input}_i)
$$

Examples

# \dontrun{
rates = torch_rand(c(4, 4)) * 5 # rate parameter between 0 and 5
torch_poisson(rates)
#> torch_tensor
#> 1  4  0  4
#> 6  0  4  2
#> 1  0  1  0
#> 1  3  3  4
#> [ CPUFloatType{4,4} ]
# }

diff --git a/docs/reference/torch_polygamma.html b/docs/reference/torch_polygamma.html deleted file mode 100644

    Polygamma

Arguments
    n

    (int) the order of the polygamma function

    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

Note

This function is not implemented for \(n \geq 2\).

polygamma(n, input, out=None) -> Tensor

Computes the \(n^{th}\) derivative of the digamma function on input.
\(n \geq 0\) is called the order of the polygamma function.

$$
\psi^{(n)}(x) = \frac{d^{(n)}}{dx^{(n)}} \psi(x)
$$

    Examples
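A small sketch for the supported order n = 1, checked against the identities \(\psi^{(1)}(1) = \pi^2/6 \approx 1.6449\) and \(\psi^{(1)}(1/2) = \pi^2/2 \approx 4.9348\):

# \dontrun{
a = torch_tensor(c(1, 0.5))
torch_polygamma(1, a)
#> expected: approximately 1.6449 and 4.9348
# }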

diff --git a/docs/reference/torch_pow.html b/docs/reference/torch_pow.html deleted file mode 100644

    Pow

Arguments
    input

    (Tensor) the input tensor.

    exponent

    (float or tensor) the exponent value

    out

    (Tensor, optional) the output tensor.

    self

    (float) the scalar base value for the power operation

pow(input, exponent, out=None) -> Tensor

Takes the power of each element in input with exponent and
returns a tensor with the result.

exponent can be either a single float number or a Tensor
with the same number of elements as input.

When exponent is a scalar value, the operation applied is:

$$
\mbox{out}_i = x_i^{\mbox{exponent}}
$$

When exponent is a tensor, the operation applied is:

$$
\mbox{out}_i = x_i^{\mbox{exponent}_i}
$$

When exponent is a tensor, the shapes of input
and exponent must be broadcastable.

pow(self, exponent, out=None) -> Tensor

self is a scalar float value, and exponent is a tensor.
The returned tensor out is of the same shape as exponent.

The operation applied is:

$$
\mbox{out}_i = \mbox{self}^{\mbox{exponent}_i}
$$

Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#> -0.6638
#>  0.2351
#> -0.1040
#>  0.7775
#> [ CPUFloatType{4} ]
torch_pow(a, 2)
#> torch_tensor
#> 0.4406
#> 0.0553
#> 0.0108
#> 0.6045
#> [ CPUFloatType{4} ]
exp = torch_arange(1., 5.)
a = torch_arange(1., 5.)
a
#> torch_tensor
#> 1
#> 2
#> 3
#> 4
#> [ CPUFloatType{4} ]
exp
#> torch_tensor
#> 1
#> 2
#> 3
#> 4
#> [ CPUFloatType{4} ]
torch_pow(a, exp)
#> torch_tensor
#>   1
#>   4
#>  27
#> 256
#> [ CPUFloatType{4} ]

exp = torch_arange(1., 5.)
base = 2
torch_pow(base, exp)
#> torch_tensor
#>  2
#>  4
#>  8
#> 16
#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_prod.html b/docs/reference/torch_prod.html deleted file mode 100644

    Prod

Arguments
    input

    (Tensor) the input tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. If specified, the input tensor is casted to dtype before the operation is performed. This is useful for preventing data type overflows. Default: None.

    dim

    (int) the dimension to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

prod(input, dtype=None) -> Tensor

Returns the product of all elements in the input tensor.

prod(input, dim, keepdim=False, dtype=None) -> Tensor

Returns the product of each row of the input tensor in the given
dimension dim.

If keepdim is True, the output tensor is of the same size
as input except in the dimension dim where it is of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting in
the output tensor having 1 fewer dimension than input.

Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
#> torch_tensor
#> 0.0090  0.8878  1.0236
#> [ CPUFloatType{1,3} ]
torch_prod(a)
#> torch_tensor
#> 0.00817587
#> [ CPUFloatType{} ]

a = torch_randn(c(4, 2))
a
#> torch_tensor
#> -1.1330  0.8404
#>  2.0557  0.2876
#>  2.0148  1.2245
#>  0.4052  0.2208
#> [ CPUFloatType{4,2} ]
torch_prod(a, 1)
#> torch_tensor
#> -1.9012
#>  0.0653
#> [ CPUFloatType{2} ]
# }

diff --git a/docs/reference/torch_promote_types.html b/docs/reference/torch_promote_types.html deleted file mode 100644

    Promote_types

Arguments
    type1

    (torch.dtype)

    type2

    (torch.dtype)

promote_types(type1, type2) -> dtype

Returns the torch_dtype with the smallest size and scalar kind that is
not smaller nor of lower kind than either type1 or type2. See the type promotion
documentation for more information on the type
promotion logic.

    Examples

    -
    # \dontrun{ - -torch_promote_types(torch_int32(), torch_float32())
    #> torch_Float
    torch_promote_types(torch_uint8(), torch_long())
    #> torch_Long
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_qr.html b/docs/reference/torch_qr.html deleted file mode 100644 index ed7e5d585923f752062092b07e339948ddcd0290..0000000000000000000000000000000000000000 --- a/docs/reference/torch_qr.html +++ /dev/null @@ -1,247 +0,0 @@ - - - - - - - - -Qr — torch_qr • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Qr

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor of size \((*, m, n)\) where * is zero or more batch dimensions consisting of matrices of dimension \(m \times n\).

    some

    (bool, optional) Set to True for reduced QR decomposition and False for complete QR decomposition.

    out

(tuple, optional) tuple of Q and R tensors satisfying input = torch.matmul(Q, R). The dimensions of Q and R are \((*, m, k)\) and \((*, k, n)\) respectively, where \(k = \min(m, n)\) if some is True and \(k = m\) otherwise.

    - -

    Note

    - -

Precision may be lost if the magnitudes of the elements of input are large.

    -

    While it should always give you a valid decomposition, it may not -give you the same one across platforms - it will depend on your -LAPACK implementation.

    -

    qr(input, some=True, out=None) -> (Tensor, Tensor)

    - - - - -

    Computes the QR decomposition of a matrix or a batch of matrices input, -and returns a namedtuple (Q, R) of tensors such that \(\mbox{input} = Q R\) -with \(Q\) being an orthogonal matrix or batch of orthogonal matrices and -\(R\) being an upper triangular matrix or batch of upper triangular matrices.

    -

    If some is True, then this function returns the thin (reduced) QR factorization. -Otherwise, if some is False, this function returns the complete QR factorization.

    - -

    Examples

    -
    # \dontrun{ - -a = torch_tensor(matrix(c(12., -51, 4, 6, 167, -68, -4, 24, -41), ncol = 3, byrow = TRUE)) -out = torch_qr(a) -q = out[[1]] -r = out[[2]] -torch_mm(q, r)$round()
    #> torch_tensor -#> 12 -51 4 -#> 6 167 -68 -#> -4 24 -41 -#> [ CPUFloatType{3,3} ]
    torch_mm(q$t(), q)$round()
    #> torch_tensor -#> 1 0 0 -#> 0 1 -0 -#> 0 -0 1 -#> [ CPUFloatType{3,3} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_qscheme.html b/docs/reference/torch_qscheme.html deleted file mode 100644 index 31a9de9088ed5d28e42a7e82b63d074ef390b553..0000000000000000000000000000000000000000 --- a/docs/reference/torch_qscheme.html +++ /dev/null @@ -1,203 +0,0 @@ - - - - - - - - -Creates the corresponding Scheme object — torch_qscheme • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Creates the corresponding Scheme object

    -
    - -
    torch_per_channel_affine()
    -
    -torch_per_tensor_affine()
    -
    -torch_per_channel_symmetric()
    -
    -torch_per_tensor_symmetric()
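Hedged sketch: each helper simply returns the corresponding quantization scheme object, which can be inspected by printing or passed wherever a qscheme is expected:

torch_per_tensor_affine()
torch_per_channel_symmetric()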
    - - - - - - - - diff --git a/docs/reference/torch_quantize_per_channel.html b/docs/reference/torch_quantize_per_channel.html deleted file mode 100644 index ee1313b292b500bd4dc5df9a716d7d4b3b7e7fc2..0000000000000000000000000000000000000000 --- a/docs/reference/torch_quantize_per_channel.html +++ /dev/null @@ -1,239 +0,0 @@ - - - - - - - - -Quantize_per_channel — torch_quantize_per_channel • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Quantize_per_channel

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) float tensor to quantize

    scales

    (Tensor) float 1D tensor of scales to use, size should match input.size(axis)

    zero_points

(Tensor) integer 1D tensor of offsets to use; size should match input.size(axis)

    axis

(int) the dimension along which to apply per-channel quantization

    dtype

(torch.dtype) the desired data type of the returned tensor. Has to be one of the quantized dtypes: torch_quint8, torch_qint8, torch_qint32

    - -

    quantize_per_channel(input, scales, zero_points, axis, dtype) -> Tensor

    - - - - -

Converts a float tensor to a per-channel quantized tensor with the given scales and zero points.

    - -

    Examples

    -
    # \dontrun{ -x = torch_tensor(matrix(c(-1.0, 0.0, 1.0, 2.0), ncol = 2, byrow = TRUE)) -torch_quantize_per_channel(x, torch_tensor(c(0.1, 0.01)), - torch_tensor(c(10L, 0L)), 0, torch_quint8())
    #> torch_tensor -#> -1 0 -#> 1 2 -#> [ CPUFloatType{2,2} ]
    torch_quantize_per_channel(x, torch_tensor(c(0.1, 0.01)), - torch_tensor(c(10L, 0L)), 0, torch_quint8())$int_repr()
    #> torch_tensor -#> 0 10 -#> 100 200 -#> [ CPUByteType{2,2} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_quantize_per_tensor.html b/docs/reference/torch_quantize_per_tensor.html deleted file mode 100644 index b97f07e78416d03ed4ab80c334e7130b2d44fcea..0000000000000000000000000000000000000000 --- a/docs/reference/torch_quantize_per_tensor.html +++ /dev/null @@ -1,236 +0,0 @@ - - - - - - - - -Quantize_per_tensor — torch_quantize_per_tensor • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Quantize_per_tensor

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) float tensor to quantize

    scale

    (float) scale to apply in quantization formula

    zero_point

    (int) offset in integer value that maps to float zero

    dtype

(torch.dtype) the desired data type of the returned tensor. Has to be one of the quantized dtypes: torch_quint8, torch_qint8, torch_qint32

    - -

    quantize_per_tensor(input, scale, zero_point, dtype) -> Tensor

    - - - - -

Converts a float tensor to a quantized tensor with the given scale and zero point.

    - -

    Examples

    -
    # \dontrun{ -torch_quantize_per_tensor(torch_tensor(c(-1.0, 0.0, 1.0, 2.0)), 0.1, 10, torch_quint8())
    #> torch_tensor -#> -1 -#> 0 -#> 1 -#> 2 -#> [ CPUFloatType{4} ]
    torch_quantize_per_tensor(torch_tensor(c(-1.0, 0.0, 1.0, 2.0)), 0.1, 10, torch_quint8())$int_repr()
    #> torch_tensor -#> 0 -#> 10 -#> 20 -#> 30 -#> [ CPUByteType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_rand.html b/docs/reference/torch_rand.html deleted file mode 100644 index ab2f26082a07fe20f96bf2b5aa5066706cfd7b41..0000000000000000000000000000000000000000 --- a/docs/reference/torch_rand.html +++ /dev/null @@ -1,245 +0,0 @@ - - - - - - - - -Rand — torch_rand • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Rand

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    size

    (int...) a sequence of integers defining the shape of the output tensor. Can be a variable number of arguments or a collection like a list or tuple.

    out

    (Tensor, optional) the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

    layout

    (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    - -

    rand(*size, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

    - - - - -

    Returns a tensor filled with random numbers from a uniform distribution -on the interval \([0, 1)\)

    -

    The shape of the tensor is defined by the variable argument size.

    - -

    Examples

    -
    # \dontrun{ - -torch_rand(4)
    #> torch_tensor -#> 0.8391 -#> 0.5766 -#> 0.5790 -#> 0.0523 -#> [ CPUFloatType{4} ]
    torch_rand(c(2, 3))
    #> torch_tensor -#> 0.9712 0.5669 0.1881 -#> 0.4962 0.3052 0.8577 -#> [ CPUFloatType{2,3} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_rand_like.html b/docs/reference/torch_rand_like.html deleted file mode 100644 index acfa0a87980cd1e7c15232488758cb59565f1163..0000000000000000000000000000000000000000 --- a/docs/reference/torch_rand_like.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Rand_like — torch_rand_like • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Rand_like

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the size of input will determine size of the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned Tensor. Default: if None, defaults to the dtype of input.

    layout

    (torch.layout, optional) the desired layout of returned tensor. Default: if None, defaults to the layout of input.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, defaults to the device of input.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    memory_format

    (torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_preserve_format.

    - -

    rand_like(input, dtype=None, layout=None, device=None, requires_grad=False, memory_format=torch.preserve_format) -> Tensor

    - - - - -

    Returns a tensor with the same size as input that is filled with -random numbers from a uniform distribution on the interval \([0, 1)\). -torch_rand_like(input) is equivalent to -torch_rand(input.size(), dtype=input.dtype, layout=input.layout, device=input.device).
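A minimal sketch (shape, dtype and device come from the input; the values are random draws):

x = torch_empty(c(2, 3))
torch_rand_like(x)  # uniform samples in [0, 1) with the shape of x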

    - - - - - - - - diff --git a/docs/reference/torch_randint.html b/docs/reference/torch_randint.html deleted file mode 100644 index 497200d064480f599c798787cfa831cbe4ad2d1f..0000000000000000000000000000000000000000 --- a/docs/reference/torch_randint.html +++ /dev/null @@ -1,263 +0,0 @@ - - - - - - - - -Randint — torch_randint • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Randint

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    low

    (int, optional) Lowest integer to be drawn from the distribution. Default: 0.

    high

    (int) One above the highest integer to be drawn from the distribution.

    size

    (tuple) a tuple defining the shape of the output tensor.

    generator

    (torch.Generator, optional) a pseudorandom number generator for sampling

    out

    (Tensor, optional) the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

    layout

    (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    - -

    randint(low=0, high, size, *, generator=None, out=None, \

    - - - - -

    dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

    -

    Returns a tensor filled with random integers generated uniformly -between low (inclusive) and high (exclusive).

    -

    The shape of the tensor is defined by the variable argument size.

    -

Note: with the global dtype default (torch_float32), this function returns a tensor with dtype torch_int64.

    - -

    Examples

    -
    # \dontrun{ - -torch_randint(3, 5, list(3))
    #> torch_tensor -#> 4 -#> 3 -#> 3 -#> [ CPUFloatType{3} ]
    torch_randint(0, 10, size = list(2, 2))
    #> torch_tensor -#> 0 7 -#> 8 6 -#> [ CPUFloatType{2,2} ]
    torch_randint(3, 10, list(2, 2))
    #> torch_tensor -#> 8 8 -#> 3 5 -#> [ CPUFloatType{2,2} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_randint_like.html b/docs/reference/torch_randint_like.html deleted file mode 100644 index e4b6cb5b5573fd27a9a387f325556c630d2adcdc..0000000000000000000000000000000000000000 --- a/docs/reference/torch_randint_like.html +++ /dev/null @@ -1,244 +0,0 @@ - - - - - - - - -Randint_like — torch_randint_like • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Randint_like

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the size of input will determine size of the output tensor.

    low

    (int, optional) Lowest integer to be drawn from the distribution. Default: 0.

    high

    (int) One above the highest integer to be drawn from the distribution.

    dtype

    (torch.dtype, optional) the desired data type of returned Tensor. Default: if None, defaults to the dtype of input.

    layout

    (torch.layout, optional) the desired layout of returned tensor. Default: if None, defaults to the layout of input.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, defaults to the device of input.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    memory_format

    (torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_preserve_format.

    - -

    randint_like(input, low=0, high, dtype=None, layout=torch.strided, device=None, requires_grad=False,

    - - - - -

    memory_format=torch.preserve_format) -> Tensor

    -

    Returns a tensor with the same shape as Tensor input filled with -random integers generated uniformly between low (inclusive) and -high (exclusive).

    -

Note: with the global dtype default (torch_float32), this function returns a tensor with dtype torch_int64.
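A minimal sketch (assuming the argument order of the signature above; values are random):

x = torch_zeros(c(2, 3))
torch_randint_like(x, low = 0, high = 10)  # integers in [0, 10) with the shape of x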

    - - - - - - - - diff --git a/docs/reference/torch_randn.html b/docs/reference/torch_randn.html deleted file mode 100644 index bd219584d92c4489ccdf138b62739669cb00ddb7..0000000000000000000000000000000000000000 --- a/docs/reference/torch_randn.html +++ /dev/null @@ -1,249 +0,0 @@ - - - - - - - - -Randn — torch_randn • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Randn

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    size

    (int...) a sequence of integers defining the shape of the output tensor. Can be a variable number of arguments or a collection like a list or tuple.

    out

    (Tensor, optional) the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

    layout

    (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    - -

    randn(*size, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

    - - - - -

    Returns a tensor filled with random numbers from a normal distribution -with mean 0 and variance 1 (also called the standard normal -distribution).

    -

    $$ - \mbox{out}_{i} \sim \mathcal{N}(0, 1) -$$ -The shape of the tensor is defined by the variable argument size.

    - -

    Examples

    -
    # \dontrun{ - -torch_randn(c(4))
    #> torch_tensor -#> -0.5578 -#> -1.6968 -#> -0.0944 -#> -0.7900 -#> [ CPUFloatType{4} ]
    torch_randn(c(2, 3))
    #> torch_tensor -#> -2.1279 1.0919 2.2659 -#> 0.1722 0.1719 -1.1738 -#> [ CPUFloatType{2,3} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_randn_like.html b/docs/reference/torch_randn_like.html deleted file mode 100644 index e4b94d82aee199d86d1c48a86bc2ddca650b8d8a..0000000000000000000000000000000000000000 --- a/docs/reference/torch_randn_like.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Randn_like — torch_randn_like • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Randn_like

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the size of input will determine size of the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned Tensor. Default: if None, defaults to the dtype of input.

    layout

    (torch.layout, optional) the desired layout of returned tensor. Default: if None, defaults to the layout of input.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, defaults to the device of input.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    memory_format

    (torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_preserve_format.

    - -

    randn_like(input, dtype=None, layout=None, device=None, requires_grad=False, memory_format=torch.preserve_format) -> Tensor

    - - - - -

    Returns a tensor with the same size as input that is filled with -random numbers from a normal distribution with mean 0 and variance 1. -torch_randn_like(input) is equivalent to -torch_randn(input.size(), dtype=input.dtype, layout=input.layout, device=input.device).
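A minimal sketch (the values are random draws from N(0, 1)):

x = torch_empty(c(2, 2))
torch_randn_like(x)  # standard-normal samples with the shape, dtype and device of x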

    - - - - - - - - diff --git a/docs/reference/torch_randperm.html b/docs/reference/torch_randperm.html deleted file mode 100644 index d61a9b411350ad6fc0be367b4230e98da2eae6d8..0000000000000000000000000000000000000000 --- a/docs/reference/torch_randperm.html +++ /dev/null @@ -1,240 +0,0 @@ - - - - - - - - -Randperm — torch_randperm • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Randperm

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    n

    (int) the upper bound (exclusive)

    out

    (Tensor, optional) the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: torch_int64.

    layout

    (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    - -

    randperm(n, out=None, dtype=torch.int64, layout=torch.strided, device=None, requires_grad=False) -> LongTensor

    - - - - -

    Returns a random permutation of integers from 0 to n - 1.

    - -

    Examples

    -
    # \dontrun{ - -torch_randperm(4)
    #> torch_tensor -#> 0 -#> 2 -#> 1 -#> 3 -#> [ CPULongType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_range.html b/docs/reference/torch_range.html deleted file mode 100644 index bff10a71018fcb3c6159535f72cdb0fd9a5170ff..0000000000000000000000000000000000000000 --- a/docs/reference/torch_range.html +++ /dev/null @@ -1,264 +0,0 @@ - - - - - - - - -Range — torch_range • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Range

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    start

    (float) the starting value for the set of points. Default: 0.

    end

    (float) the ending value for the set of points

    step

    (float) the gap between each pair of adjacent points. Default: 1.

    out

    (Tensor, optional) the output tensor.

    dtype

(torch.dtype, optional) the desired data type of the returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type). If dtype is not given, infer the data type from the other input arguments. If any of start, end, or step are floating-point, the dtype is inferred to be the default dtype (see torch.get_default_dtype). Otherwise, the dtype is inferred to be torch_int64.

    layout

    (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    - -

    range(start=0, end, step=1, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

    - - - - -

    Returns a 1-D tensor of size \(\left\lfloor \frac{\mbox{end} - \mbox{start}}{\mbox{step}} \right\rfloor + 1\) -with values from start to end with step step. Step is -the gap between two values in the tensor.

    -

    $$ - \mbox{out}_{i+1} = \mbox{out}_i + \mbox{step}. -$$

    -

    Warning

    - - - -

    This function is deprecated in favor of torch_arange.

    - -

    Examples

    -
    # \dontrun{ - -torch_range(1, 4)
    #> Warning: This function is deprecated in favor of torch_arange.
    #> torch_tensor -#> 1 -#> 2 -#> 3 -#> [ CPUFloatType{3} ]
    torch_range(1, 4, 0.5)
    #> Warning: This function is deprecated in favor of torch_arange.
    #> torch_tensor -#> 1.0000 -#> 1.5000 -#> 2.0000 -#> 2.5000 -#> 3.0000 -#> 3.5000 -#> [ CPUFloatType{6} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_real.html b/docs/reference/torch_real.html deleted file mode 100644 index 542b16ad293283a386df2ae4010cafb768383e47..0000000000000000000000000000000000000000 --- a/docs/reference/torch_real.html +++ /dev/null @@ -1,226 +0,0 @@ - - - - - - - - -Real — torch_real • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Real

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    real(input, out=None) -> Tensor

    - - - - -

    Returns the real part of the input tensor. If -input is a real (non-complex) tensor, this function just -returns it.

    -

    Warning

    - - - -

    Not yet implemented for complex tensors.

    -

    $$ - \mbox{out}_{i} = real(\mbox{input}_{i}) -$$

    - -

    Examples

    -
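# A hedged sketch: for a real (non-complex) tensor the input is returned unchanged.
x = torch_tensor(c(-1, 2))
torch_real(x)  # same values as x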
    
    - - - - - - - - diff --git a/docs/reference/torch_reciprocal.html b/docs/reference/torch_reciprocal.html deleted file mode 100644 index 14716d7ddefe5b2304c0fd8d4d36f7838973880b..0000000000000000000000000000000000000000 --- a/docs/reference/torch_reciprocal.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Reciprocal — torch_reciprocal • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Reciprocal

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    reciprocal(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with the reciprocal of the elements of input

    -

    $$ - \mbox{out}_{i} = \frac{1}{\mbox{input}_{i}} -$$

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> -0.6585 -#> 0.2569 -#> 1.4761 -#> -0.0839 -#> [ CPUFloatType{4} ]
    torch_reciprocal(a)
    #> torch_tensor -#> -1.5185 -#> 3.8925 -#> 0.6775 -#> -11.9170 -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_reduction.html b/docs/reference/torch_reduction.html deleted file mode 100644 index 0ee7d2894e047c90f10970fae4d13437b535daf6..0000000000000000000000000000000000000000 --- a/docs/reference/torch_reduction.html +++ /dev/null @@ -1,201 +0,0 @@ - - - - - - - - -Creates the reduction objet — torch_reduction • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

Creates the reduction object

    -
    - -
    torch_reduction_sum()
    -
    -torch_reduction_mean()
    -
    -torch_reduction_none()
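Hedged sketch: each helper just returns the corresponding reduction code, which can be inspected directly and passed to the low-level functions that take a reduction argument:

torch_reduction_mean()  # the code that low-level torch functions read as "mean"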
    - - - - - - - - diff --git a/docs/reference/torch_relu_.html b/docs/reference/torch_relu_.html deleted file mode 100644 index d284eac5db78489921293ffb6daac1be0f1c4e6d..0000000000000000000000000000000000000000 --- a/docs/reference/torch_relu_.html +++ /dev/null @@ -1,202 +0,0 @@ - - - - - - - - -Relu_ — torch_relu_ • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Relu_

    -
    - - - -

    relu_(input) -> Tensor

    - - - - -

    In-place version of torch_relu.
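A minimal sketch of the in-place semantics (the trailing underscore means the input itself is modified):

x = torch_tensor(c(-1, 0, 2))
torch_relu_(x)  # clamps negative entries to 0 in place
x               # now 0 0 2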

    - - - - - - - - diff --git a/docs/reference/torch_remainder.html b/docs/reference/torch_remainder.html deleted file mode 100644 index 9517e58f371de5724019758a758ac4038db879a8..0000000000000000000000000000000000000000 --- a/docs/reference/torch_remainder.html +++ /dev/null @@ -1,240 +0,0 @@ - - - - - - - - -Remainder — torch_remainder • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Remainder

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) the dividend

    other

    (Tensor or float) the divisor that may be either a number or a Tensor of the same shape as the dividend

    out

    (Tensor, optional) the output tensor.

    - -

    remainder(input, other, out=None) -> Tensor

    - - - - -

    Computes the element-wise remainder of division.

    -

Both the dividend and the divisor may be integer or floating point numbers. The remainder has the same sign as the divisor.

    -

    When other is a tensor, the shapes of input and -other must be broadcastable .

    - -

    Examples

    -
    # \dontrun{ - -torch_remainder(torch_tensor(c(-3., -2, -1, 1, 2, 3)), 2)
    #> torch_tensor -#> 1 -#> 0 -#> 1 -#> 1 -#> 0 -#> 1 -#> [ CPUFloatType{6} ]
    torch_remainder(torch_tensor(c(1., 2, 3, 4, 5)), 1.5)
    #> torch_tensor -#> 1.0000 -#> 0.5000 -#> 0.0000 -#> 1.0000 -#> 0.5000 -#> [ CPUFloatType{5} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_renorm.html b/docs/reference/torch_renorm.html deleted file mode 100644 index ce1b88c0776736aa5291f9be0e0642566353bbda..0000000000000000000000000000000000000000 --- a/docs/reference/torch_renorm.html +++ /dev/null @@ -1,252 +0,0 @@ - - - - - - - - -Renorm — torch_renorm • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Renorm

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    p

    (float) the power for the norm computation

    dim

    (int) the dimension to slice over to get the sub-tensors

    maxnorm

    (float) the maximum norm to keep each sub-tensor under

    out

    (Tensor, optional) the output tensor.

    - -

    Note

    - -

If the norm of a row is lower than maxnorm, the row is unchanged.

    -

    renorm(input, p, dim, maxnorm, out=None) -> Tensor

    - - - - -

    Returns a tensor where each sub-tensor of input along dimension -dim is normalized such that the p-norm of the sub-tensor is lower -than the value maxnorm

    - -

    Examples

    -
    # \dontrun{ -x = torch_ones(c(3, 3)) -x[2,]$fill_(2)
    #> torch_tensor -#> 2 -#> 2 -#> 2 -#> [ CPUFloatType{3} ]
    x[3,]$fill_(3)
    #> torch_tensor -#> 3 -#> 3 -#> 3 -#> [ CPUFloatType{3} ]
    x
    #> torch_tensor -#> 1 1 1 -#> 2 2 2 -#> 3 3 3 -#> [ CPUFloatType{3,3} ]
    torch_renorm(x, 1, 1, 5)
    #> torch_tensor -#> 1.0000 1.0000 1.0000 -#> 1.6667 1.6667 1.6667 -#> 1.6667 1.6667 1.6667 -#> [ CPUFloatType{3,3} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_repeat_interleave.html b/docs/reference/torch_repeat_interleave.html deleted file mode 100644 index c0b945915d43f894cbf09ae4bf1cb02e7d02da8e..0000000000000000000000000000000000000000 --- a/docs/reference/torch_repeat_interleave.html +++ /dev/null @@ -1,234 +0,0 @@ - - - - - - - - -Repeat_interleave — torch_repeat_interleave • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Repeat_interleave

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    repeats

    (Tensor or int) The number of repetitions for each element. repeats is broadcasted to fit the shape of the given axis.

    dim

    (int, optional) The dimension along which to repeat values. By default, use the flattened input array, and return a flat output array.

    - -

    repeat_interleave(input, repeats, dim=None) -> Tensor

    - - - - -

    Repeat elements of a tensor.

    -

    Warning

    - - -
This is different from torch_Tensor.repeat but similar to numpy.repeat.
    -
    - -

    repeat_interleave(repeats) -> Tensor

    - - - - -

    If the repeats is tensor([n1, n2, n3, ...]), then the output will be -tensor([0, 0, ..., 1, 1, ..., 2, 2, ..., ...]) where 0 appears n1 times, -1 appears n2 times, 2 appears n3 times, etc.

    - -

    Examples

    -
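# A hedged sketch of the two call forms documented above.
x = torch_tensor(c(1, 2, 3))
torch_repeat_interleave(x, 2)  # each element twice: 1 1 2 2 3 3
# With only a repeats tensor, indices are expanded as described above:
torch_repeat_interleave(torch_tensor(c(2L, 3L)))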
    
    - - - - - - - - diff --git a/docs/reference/torch_reshape.html b/docs/reference/torch_reshape.html deleted file mode 100644 index d149b6c0f0109211a63dc87512e614848fad179d..0000000000000000000000000000000000000000 --- a/docs/reference/torch_reshape.html +++ /dev/null @@ -1,236 +0,0 @@ - - - - - - - - -Reshape — torch_reshape • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Reshape

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the tensor to be reshaped

    shape

    (tuple of ints) the new shape

    - -

    reshape(input, shape) -> Tensor

    - - - - -

    Returns a tensor with the same data and number of elements as input, -but with the specified shape. When possible, the returned tensor will be a view -of input. Otherwise, it will be a copy. Contiguous inputs and inputs -with compatible strides can be reshaped without copying, but you should not -depend on the copying vs. viewing behavior.

    -

    See torch_Tensor.view on when it is possible to return a view.

    -

    A single dimension may be -1, in which case it's inferred from the remaining -dimensions and the number of elements in input.

    - -

    Examples

    -
    # \dontrun{ - -a = torch_arange(0, 4) -torch_reshape(a, list(2, 2))
    #> torch_tensor -#> 0 1 -#> 2 3 -#> [ CPUFloatType{2,2} ]
    b = torch_tensor(matrix(c(0, 1, 2, 3), ncol = 2, byrow=TRUE)) -torch_reshape(b, list(-1))
    #> torch_tensor -#> 0 -#> 1 -#> 2 -#> 3 -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_result_type.html b/docs/reference/torch_result_type.html deleted file mode 100644 index a1f275a0423c64ac838c9b3fe2043fcfdb67f9d7..0000000000000000000000000000000000000000 --- a/docs/reference/torch_result_type.html +++ /dev/null @@ -1,222 +0,0 @@ - - - - - - - - -Result_type — torch_result_type • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Result_type

    -
    - - -

    Arguments

    - - - - - - - - - - -
    tensor1

    (Tensor or Number) an input tensor or number

    tensor2

    (Tensor or Number) an input tensor or number

    - -

    result_type(tensor1, tensor2) -> dtype

    - - - - -

    Returns the torch_dtype that would result from performing an arithmetic -operation on the provided input tensors. See type promotion documentation -for more information on the type promotion logic.

    - -

    Examples

    -
    # \dontrun{ - -torch_result_type(tensor = torch_tensor(c(1, 2), dtype=torch_int()), 1.0)
    #> torch_Float
    # torch_result_type(tensor = torch_tensor(c(1, 2), dtype=torch_uint8()), torch_tensor(1)) -# } -
    - - - - - - - - diff --git a/docs/reference/torch_rfft.html b/docs/reference/torch_rfft.html deleted file mode 100644 index ffc88d168ef3cd592ca8125cf673133d0ccc8ceb..0000000000000000000000000000000000000000 --- a/docs/reference/torch_rfft.html +++ /dev/null @@ -1,324 +0,0 @@ - - - - - - - - -Rfft — torch_rfft • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Rfft

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor of at least signal_ndim dimensions

    signal_ndim

    (int) the number of dimensions in each signal. signal_ndim can only be 1, 2 or 3

    normalized

    (bool, optional) controls whether to return normalized results. Default: False

    onesided

    (bool, optional) controls whether to return half of results to avoid redundancy. Default: True

    - -

    Note

    - - -
    For CUDA tensors, an LRU cache is used for cuFFT plans to speed up
    -repeatedly running FFT methods on tensors of same geometry with same
    -configuration. See cufft-plan-cache for more details on how to
    -monitor and control the cache.
    -
    - -

    rfft(input, signal_ndim, normalized=False, onesided=True) -> Tensor

    - - - - -

    Real-to-complex Discrete Fourier Transform

    -

    This method computes the real-to-complex discrete Fourier transform. It is -mathematically equivalent with torch_fft with differences only in -formats of the input and output.

    -

    This method supports 1D, 2D and 3D real-to-complex transforms, indicated -by signal_ndim. input must be a tensor with at least -signal_ndim dimensions with optionally arbitrary number of leading batch -dimensions. If normalized is set to True, this normalizes the result -by dividing it with \(\sqrt{\prod_{i=1}^K N_i}\) so that the operator is -unitary, where \(N_i\) is the size of signal dimension \(i\).

    -

    The real-to-complex Fourier transform results follow conjugate symmetry:

    -

    $$ - X[\omega_1, \dots, \omega_d] = X^*[N_1 - \omega_1, \dots, N_d - \omega_d], -$$ -where the index arithmetic is computed modulus the size of the corresponding -dimension, \(\ ^*\) is the conjugate operator, and -\(d\) = signal_ndim. onesided flag controls whether to avoid -redundancy in the output results. If set to True (default), the output will -not be full complex result of shape \((*, 2)\), where \(*\) is the shape -of input, but instead the last dimension will be halfed as of size -\(\lfloor \frac{N_d}{2} \rfloor + 1\).

    -

    The inverse of this function is torch_irfft.

    -

    Warning

    - - - -

    For CPU tensors, this method is currently only available with MKL. Use -torch_backends.mkl.is_available to check if MKL is installed.

    - -

    Examples

    -
    # \dontrun{ - -x = torch_randn(c(5, 5)) -torch_rfft(x, 2)
    #> torch_tensor -#> (1,.,.) = -#> 4.9202 0.0000 -#> 2.9093 2.7036 -#> 1.8227 -4.8232 -#> -#> (2,.,.) = -#> -2.0160 4.4730 -#> 1.8297 -3.0301 -#> 0.4639 0.6830 -#> -#> (3,.,.) = -#> -6.8659 1.3907 -#> -5.6917 -4.3527 -#> 2.7115 0.4562 -#> -#> (4,.,.) = -#> -6.8659 -1.3907 -#> -3.4394 -1.4155 -#> 4.7454 -1.6532 -#> -#> (5,.,.) = -#> -2.0160 -4.4730 -#> -1.0448 5.6595 -#> 5.7855 -3.2515 -#> [ CPUFloatType{5,3,2} ]
    torch_rfft(x, 2, onesided=FALSE)
    #> torch_tensor -#> (1,.,.) = -#> 4.9202 0.0000 -#> 2.9093 2.7036 -#> 1.8227 -4.8232 -#> 1.8227 4.8232 -#> 2.9093 -2.7036 -#> -#> (2,.,.) = -#> -2.0160 4.4730 -#> 1.8297 -3.0301 -#> 0.4639 0.6830 -#> 5.7855 3.2515 -#> -1.0448 -5.6595 -#> -#> (3,.,.) = -#> -6.8659 1.3907 -#> -5.6917 -4.3527 -#> 2.7115 0.4562 -#> 4.7454 1.6532 -#> -3.4394 1.4155 -#> -#> (4,.,.) = -#> -6.8659 -1.3907 -#> -3.4394 -1.4155 -#> 4.7454 -1.6532 -#> 2.7115 -0.4562 -#> -5.6917 4.3527 -#> -#> (5,.,.) = -#> -2.0160 -4.4730 -#> -1.0448 5.6595 -#> 5.7855 -3.2515 -#> 0.4639 -0.6830 -#> 1.8297 3.0301 -#> [ CPUFloatType{5,5,2} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_roll.html b/docs/reference/torch_roll.html deleted file mode 100644 index 5867dff58a0e19d95d7d34db4b3f773e9d6f0cb0..0000000000000000000000000000000000000000 --- a/docs/reference/torch_roll.html +++ /dev/null @@ -1,247 +0,0 @@ - - - - - - - - -Roll — torch_roll • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Roll

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    shifts

    (int or tuple of ints) The number of places by which the elements of the tensor are shifted. If shifts is a tuple, dims must be a tuple of the same size, and each dimension will be rolled by the corresponding value

    dims

    (int or tuple of ints) Axis along which to roll

    - -

    roll(input, shifts, dims=None) -> Tensor

    - - - - -

    Roll the tensor along the given dimension(s). Elements that are shifted beyond the -last position are re-introduced at the first position. If a dimension is not -specified, the tensor will be flattened before rolling and then restored -to the original shape.

    - -

    Examples

    -
    # \dontrun{ - -x = torch_tensor(c(1, 2, 3, 4, 5, 6, 7, 8))$view(c(4, 2)) -x
    #> torch_tensor -#> 1 2 -#> 3 4 -#> 5 6 -#> 7 8 -#> [ CPUFloatType{4,2} ]
    torch_roll(x, 1, 1)
    #> torch_tensor -#> 7 8 -#> 1 2 -#> 3 4 -#> 5 6 -#> [ CPUFloatType{4,2} ]
    torch_roll(x, -1, 1)
    #> torch_tensor -#> 3 4 -#> 5 6 -#> 7 8 -#> 1 2 -#> [ CPUFloatType{4,2} ]
    torch_roll(x, shifts=list(2, 1), dims=list(1, 2))
    #> torch_tensor -#> 6 5 -#> 8 7 -#> 2 1 -#> 4 3 -#> [ CPUFloatType{4,2} ]
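# Hedged: with dims unspecified, the tensor is flattened, rolled, then reshaped back,
# so the 8 wraps around to the front and everything else shifts one place.
torch_roll(x, 1)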
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_rot90.html b/docs/reference/torch_rot90.html deleted file mode 100644 index 09927b2d58dfbe0f9a437aa3de23c50dc46d20c7..0000000000000000000000000000000000000000 --- a/docs/reference/torch_rot90.html +++ /dev/null @@ -1,248 +0,0 @@ - - - - - - - - -Rot90 — torch_rot90 • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Rot90

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    k

    (int) number of times to rotate

    dims

    (a list or tuple) axis to rotate

    - -

    rot90(input, k, dims) -> Tensor

    - - - - -

Rotate an n-D tensor by 90 degrees in the plane specified by the dims axes. The rotation direction is from the first towards the second axis if k > 0, and from the second towards the first if k < 0.

    - -

    Examples

    -
    # \dontrun{ - -x = torch_arange(0, 4)$view(c(2, 2)) -x
    #> torch_tensor -#> 0 1 -#> 2 3 -#> [ CPUFloatType{2,2} ]
    torch_rot90(x, 1, c(1, 2))
    #> torch_tensor -#> 1 3 -#> 0 2 -#> [ CPUFloatType{2,2} ]
    x = torch_arange(0, 8)$view(c(2, 2, 2)) -x
    #> torch_tensor -#> (1,.,.) = -#> 0 1 -#> 2 3 -#> -#> (2,.,.) = -#> 4 5 -#> 6 7 -#> [ CPUFloatType{2,2,2} ]
    torch_rot90(x, 1, c(1, 2))
    #> torch_tensor -#> (1,.,.) = -#> 2 3 -#> 6 7 -#> -#> (2,.,.) = -#> 0 1 -#> 4 5 -#> [ CPUFloatType{2,2,2} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_round.html b/docs/reference/torch_round.html deleted file mode 100644 index 002ae3df9556b3c74c5491b883a432d0d2ca1e6c..0000000000000000000000000000000000000000 --- a/docs/reference/torch_round.html +++ /dev/null @@ -1,231 +0,0 @@ - - - - - - - - -Round — torch_round • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Round

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    round(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with each of the elements of input rounded -to the closest integer.

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> 0.8911 -#> -0.2131 -#> -0.4903 -#> -0.0724 -#> [ CPUFloatType{4} ]
    torch_round(a)
    #> torch_tensor -#> 1 -#> -0 -#> -0 -#> -0 -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_rrelu_.html b/docs/reference/torch_rrelu_.html deleted file mode 100644 index dc2001c657f3e4f922c6e4d6f0f5edf524c65954..0000000000000000000000000000000000000000 --- a/docs/reference/torch_rrelu_.html +++ /dev/null @@ -1,202 +0,0 @@ - - - - - - - - -Rrelu_ — torch_rrelu_ • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Rrelu_

    -
    - - - -

    rrelu_(input, lower=1./8, upper=1./3, training=False) -> Tensor

    - - - - -

    In-place version of torch_rrelu.
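A minimal sketch (the trailing underscore means input is modified in place; with training=TRUE the negative slope is sampled at random):

x = torch_randn(c(4))
torch_rrelu_(x)
x  # negative entries scaled in place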

    - - - - - - - - diff --git a/docs/reference/torch_rsqrt.html b/docs/reference/torch_rsqrt.html deleted file mode 100644 index fc8f39b62ba2cec3c11c211eee8799f3f325caee..0000000000000000000000000000000000000000 --- a/docs/reference/torch_rsqrt.html +++ /dev/null @@ -1,234 +0,0 @@ - - - - - - - - -Rsqrt — torch_rsqrt • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Rsqrt

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    rsqrt(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with the reciprocal of the square-root of each of -the elements of input.

    -

    $$ - \mbox{out}_{i} = \frac{1}{\sqrt{\mbox{input}_{i}}} -$$

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> 0.5826 -#> -0.3344 -#> -0.2779 -#> -1.1775 -#> [ CPUFloatType{4} ]
    torch_rsqrt(a)
    #> torch_tensor -#> 1.3101 -#> nan -#> nan -#> nan -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_save.html b/docs/reference/torch_save.html deleted file mode 100644 index f735bb755eae21975bb4f38e2a55b357ee83fafa..0000000000000000000000000000000000000000 --- a/docs/reference/torch_save.html +++ /dev/null @@ -1,219 +0,0 @@ - - - - - - - - -Saves an object to a disk file. — torch_save • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

This function is experimental; don't use it for long-term storage.

    -
    - -
    torch_save(obj, path, ...)
    - -

    Arguments

    - - - - - - - - - - - - - - -
    obj

the object to save.

    path

    a connection or the name of the file to save.

    ...

    not currently used.

    - -

    See also

    - -

    Other torch_save: -torch_load()
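A minimal round-trip sketch (the tempfile() path is just illustrative):

x = torch_randn(c(2, 2))
path = tempfile(fileext = ".pt")
torch_save(x, path)
y = torch_load(path)  # y holds the same values as x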

    - - - - - - - - diff --git a/docs/reference/torch_selu_.html b/docs/reference/torch_selu_.html deleted file mode 100644 index adb79c4ebdcc58b266f3ba7dfdb4233e88374417..0000000000000000000000000000000000000000 --- a/docs/reference/torch_selu_.html +++ /dev/null @@ -1,202 +0,0 @@ - - - - - - - - -Selu_ — torch_selu_ • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Selu_

    -
    - - - -

    selu_(input) -> Tensor

    - - - - -

In-place version of torch_selu.
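A minimal sketch of the in-place semantics:

x = torch_randn(c(4))
torch_selu_(x)  # applies SELU and overwrites x
x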

    - - - - - - - - diff --git a/docs/reference/torch_sigmoid.html b/docs/reference/torch_sigmoid.html deleted file mode 100644 index 9629d30325f5455a9bbf4a9250e506ffea2ba667..0000000000000000000000000000000000000000 --- a/docs/reference/torch_sigmoid.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Sigmoid — torch_sigmoid • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Sigmoid

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    sigmoid(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with the sigmoid of the elements of input.

    -

    $$ - \mbox{out}_{i} = \frac{1}{1 + e^{-\mbox{input}_{i}}} -$$

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> 2.1600 -#> -1.3253 -#> -0.1559 -#> 0.1856 -#> [ CPUFloatType{4} ]
    torch_sigmoid(a)
    #> torch_tensor -#> 0.8966 -#> 0.2099 -#> 0.4611 -#> 0.5463 -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_sign.html b/docs/reference/torch_sign.html deleted file mode 100644 index 9f51cbf1b7415d7bcda86cf404570b0c46953db1..0000000000000000000000000000000000000000 --- a/docs/reference/torch_sign.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Sign — torch_sign • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Sign

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    sign(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with the signs of the elements of input.

    -

    $$ - \mbox{out}_{i} = \mbox{sgn}(\mbox{input}_{i}) -$$

    - -

    Examples

    -
    # \dontrun{ - -a = torch_tensor(c(0.7, -1.2, 0., 2.3)) -a
    #> torch_tensor -#> 0.7000 -#> -1.2000 -#> 0.0000 -#> 2.3000 -#> [ CPUFloatType{4} ]
    torch_sign(a)
    #> torch_tensor -#> 1 -#> -1 -#> 0 -#> 1 -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_sin.html b/docs/reference/torch_sin.html deleted file mode 100644 index fdaa9c5dba3d8b0c0f11499af8e2ae3679b93e6e..0000000000000000000000000000000000000000 --- a/docs/reference/torch_sin.html +++ /dev/null @@ -1,233 +0,0 @@ - - - - - - - - -Sin — torch_sin • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Sin

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    sin(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with the sine of the elements of input.

    -

    $$ - \mbox{out}_{i} = \sin(\mbox{input}_{i}) -$$

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> 0.5115 -#> 0.7241 -#> -0.6876 -#> -0.3453 -#> [ CPUFloatType{4} ]
    torch_sin(a)
    #> torch_tensor -#> 0.4895 -#> 0.6625 -#> -0.6347 -#> -0.3384 -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_sinh.html b/docs/reference/torch_sinh.html deleted file mode 100644 index 1f9c1b918fed599a1b493e5a946d9cb4e8f49e17..0000000000000000000000000000000000000000 --- a/docs/reference/torch_sinh.html +++ /dev/null @@ -1,234 +0,0 @@ - - - - - - - - -Sinh — torch_sinh • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Sinh

    -
    - - -

    Arguments

    - - - - - - - - - - -
    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

    - -

    sinh(input, out=None) -> Tensor

    - - - - -

    Returns a new tensor with the hyperbolic sine of the elements of -input.

    -

    $$ - \mbox{out}_{i} = \sinh(\mbox{input}_{i}) -$$

    - -

    Examples

    -
    # \dontrun{ - -a = torch_randn(c(4)) -a
    #> torch_tensor -#> 0.7504 -#> -0.3593 -#> -0.6244 -#> -1.7192 -#> [ CPUFloatType{4} ]
    torch_sinh(a)
    #> torch_tensor -#> 0.8228 -#> -0.3671 -#> -0.6658 -#> -2.7003 -#> [ CPUFloatType{4} ]
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_slogdet.html b/docs/reference/torch_slogdet.html deleted file mode 100644 index 4fc6eb514e424adda86a0d12a39dcd88df5adbc4..0000000000000000000000000000000000000000 --- a/docs/reference/torch_slogdet.html +++ /dev/null @@ -1,245 +0,0 @@ - - - - - - - - -Slogdet — torch_slogdet • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Slogdet

    -
    - - -

    Arguments

    - - - - - - -
    input

    (Tensor) the input tensor of size (*, n, n) where * is zero or more batch dimensions.

    - -

    Note

    - - -
If input has zero determinant, this returns (0, -inf).
    -
    - -
Backward through slogdet internally uses SVD results when input is not invertible. In this case, double backward through slogdet will be unstable when input doesn't have distinct singular values. See torch_svd for details.
    -
    - -

    slogdet(input) -> (Tensor, Tensor)

    - - - - -

    Calculates the sign and log absolute value of the determinant(s) of a square matrix or batches of square matrices.

    - -

    Examples

    -
    # \dontrun{ - -A = torch_randn(c(3, 3)) -A
    #> torch_tensor -#> 0.0461 -1.3909 0.9825 -#> 0.5340 0.3877 -0.2309 -#> 0.3683 1.6290 0.3208 -#> [ CPUFloatType{3,3} ]
A$det()
#> torch_tensor
#> 1.094
#> [ CPUFloatType{} ]
A$det()$log()
#> torch_tensor
#> 0.0898445
#> [ CPUFloatType{} ]
    torch_slogdet(A)
    #> [[1]] -#> torch_tensor -#> 1 -#> [ CPUFloatType{} ] -#> -#> [[2]] -#> torch_tensor -#> 0.0898445 -#> [ CPUFloatType{} ] -#>
    # } -
    - - - - - - - - diff --git a/docs/reference/torch_solve.html b/docs/reference/torch_solve.html deleted file mode 100644 index b29abbc24c939bd09fcdfb92d97b6b8ff69ee916..0000000000000000000000000000000000000000 --- a/docs/reference/torch_solve.html +++ /dev/null @@ -1,259 +0,0 @@ - - - - - - - - -Solve — torch_solve • torch - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    -
    - - - - -
    - -
    -
    - - -
    -

    Solve

    -
    - - -

    Arguments

    - - - - - - - - - - - - - - -
    input

    (Tensor) input matrix \(B\) of size \((*, m, k)\) , where \(*\) is zero or more batch dimensions.

    A

    (Tensor) input square matrix of size \((*, m, m)\), where \(*\) is zero or more batch dimensions.

    out

    ((Tensor, Tensor) optional output tuple.

    - -

    Note

    - - -
    Irrespective of the original strides, the returned matrices
    -`solution` and `LU` will be transposed, i.e. with strides like
    -`B.contiguous().transpose(-1, -2).stride()` and
    -`A.contiguous().transpose(-1, -2).stride()` respectively.
    -
    - -

solve(input, A, out=None) -> (Tensor, Tensor)

This function returns the solution to the system of linear
equations represented by \(AX = B\) and the LU factorization of
A, in order as a namedtuple solution, LU.

LU contains L and U factors for LU factorization of A.

torch_solve(B, A) can take in 2D inputs B, A or inputs that are
batches of 2D matrices. If the inputs are batches, then returns
batched outputs solution, LU.

    Examples

# \dontrun{
A = torch_tensor(rbind(c(6.80, -2.11, 5.66, 5.97, 8.23),
                       c(-6.05, -3.30, 5.36, -4.44, 1.08),
                       c(-0.45, 2.58, -2.70, 0.27, 9.04),
                       c(8.32, 2.71, 4.35, -7.17, 2.14),
                       c(-9.67, -5.14, -7.26, 6.08, -6.87)))$t()
B = torch_tensor(rbind(c(4.02, 6.19, -8.22, -7.57, -3.03),
                       c(-1.56, 4.00, -8.67, 1.75, 2.86),
                       c(9.81, -4.09, -4.57, -8.61, 8.99)))$t()
out = torch_solve(B, A)
X = out[[1]]
LU = out[[2]]
torch_dist(B, torch_mm(A, X))
    #> torch_tensor -#> 7.09771e-06 -#> [ CPUFloatType{} ]
# Batched solver example
A = torch_randn(c(2, 3, 1, 4, 4))
B = torch_randn(c(2, 3, 1, 4, 6))
out = torch_solve(B, A)
X = out[[1]]
LU = out[[2]]
torch_dist(B, A$matmul(X))
    #> torch_tensor -#> 6.14486e-06 -#> [ CPUFloatType{} ]
# }

diff --git a/docs/reference/torch_sort.html b/docs/reference/torch_sort.html
deleted file mode 100644

Sort

    Arguments

    input

    (Tensor) the input tensor.

    dim

    (int, optional) the dimension to sort along

    descending

    (bool, optional) controls the sorting order (ascending or descending)

    out

    (tuple, optional) the output tuple of (Tensor, LongTensor) that can be optionally given to be used as output buffers

sort(input, dim=-1, descending=False, out=None) -> (Tensor, LongTensor)

Sorts the elements of the input tensor along a given dimension
in ascending order by value.

If dim is not given, the last dimension of the input is chosen.

If descending is True then the elements are sorted in descending
order by value.

A namedtuple of (values, indices) is returned, where the values are the
sorted values and indices are the indices of the elements in the original
input tensor.

    Examples

# \dontrun{
x = torch_randn(c(3, 4))
out = torch_sort(x)
out
    #> [[1]] -#> torch_tensor -#> -0.9253 -0.3520 -0.0946 1.1472 -#> -0.6827 0.1674 0.7136 1.4204 -#> -0.4754 -0.0864 0.5561 1.1917 -#> [ CPUFloatType{3,4} ] -#> -#> [[2]] -#> torch_tensor -#> 3 0 1 2 -#> 0 1 2 3 -#> 2 0 3 1 -#> [ CPULongType{3,4} ] -#>
out = torch_sort(x, 1)
out
    #> [[1]] -#> torch_tensor -#> -0.6827 -0.0946 -0.4754 -0.9253 -#> -0.3520 0.1674 0.7136 0.5561 -#> -0.0864 1.1917 1.1472 1.4204 -#> [ CPUFloatType{3,4} ] -#> -#> [[2]] -#> torch_tensor -#> 1 0 2 0 -#> 0 1 1 2 -#> 2 2 0 1 -#> [ CPULongType{3,4} ] -#>
# }

diff --git a/docs/reference/torch_sparse_coo_tensor.html b/docs/reference/torch_sparse_coo_tensor.html
deleted file mode 100644

Sparse_coo_tensor

    Arguments

    indices

    (array_like) Initial data for the tensor. Can be a list, tuple, NumPy ndarray, scalar, and other types. Will be cast to a torch_LongTensor internally. The indices are the coordinates of the non-zero values in the matrix, and thus should be two-dimensional where the first dimension is the number of tensor dimensions and the second dimension is the number of non-zero values.

    values

    (array_like) Initial values for the tensor. Can be a list, tuple, NumPy ndarray, scalar, and other types.

    size

    (list, tuple, or torch.Size, optional) Size of the sparse tensor. If not provided the size will be inferred as the minimum size big enough to hold all non-zero elements.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, infers data type from values.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

sparse_coo_tensor(indices, values, size=None, dtype=None, device=None, requires_grad=False) -> Tensor

Constructs a sparse tensor in COO(rdinate) format with non-zero elements at the given indices
with the given values. A sparse tensor can be uncoalesced; in that case, there are duplicate
coordinates in the indices, and the value at that index is the sum of all duplicate value entries.

    Examples

# \dontrun{
i = torch_tensor(matrix(c(1, 2, 2, 3, 1, 3), ncol = 3, byrow = TRUE), dtype = torch_int64())
v = torch_tensor(c(3, 4, 5), dtype = torch_float32())
torch_sparse_coo_tensor(i, v)
    #> torch_tensor -#> [ SparseCPUFloatType{} -#> indices: -#> 0 1 1 -#> 2 0 2 -#> [ CPULongType{2,3} ] -#> values: -#> 3 -#> 4 -#> 5 -#> [ CPUFloatType{3} ] -#> size: -#> [2, 3] -#> ]
    torch_sparse_coo_tensor(i, v, c(2, 4))
    #> torch_tensor -#> [ SparseCPUFloatType{} -#> indices: -#> 0 1 1 -#> 2 0 2 -#> [ CPULongType{2,3} ] -#> values: -#> 3 -#> 4 -#> 5 -#> [ CPUFloatType{3} ] -#> size: -#> [2, 4] -#> ]
# create empty sparse tensors
S = torch_sparse_coo_tensor(
  torch_empty(c(1, 0), dtype = torch_int64()),
  torch_tensor(numeric(), dtype = torch_float32()),
  c(1)
)
S = torch_sparse_coo_tensor(
  torch_empty(c(1, 0), dtype = torch_int64()),
  torch_empty(c(0, 2)),
  c(1, 2)
)
# }
diff --git a/docs/reference/torch_split.html b/docs/reference/torch_split.html
deleted file mode 100644

Split

    Arguments

    tensor

    (Tensor) tensor to split.

    split_size_or_sections

    (int) size of a single chunk or list of sizes for each chunk

    dim

    (int) dimension along which to split the tensor.

split(tensor, split_size_or_sections, dim=0) -> List of Tensors

Splits the tensor into chunks. Each chunk is a view of the original tensor.

If `split_size_or_sections` is an integer type, then `tensor` will
be split into equally sized chunks (if possible). The last chunk will be smaller if
the tensor size along the given dimension `dim` is not divisible by
`split_size`.

If `split_size_or_sections` is a list, then `tensor` will be split
into `length(split_size_or_sections)` chunks with sizes in `dim` according
to `split_size_or_sections`. A usage sketch follows below.
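Examples

A minimal usage sketch, not from the original page (it assumes the torch R package is attached; torch_split chunks along the first dimension by default and returns a list of tensors):

# \dontrun{
x = torch_randn(c(5, 2))
torch_split(x, 2)        # chunks of 2 rows each; the last chunk has only 1 row
torch_split(x, c(1, 4))  # explicit chunk sizes along the first dimension
# }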
diff --git a/docs/reference/torch_sqrt.html b/docs/reference/torch_sqrt.html
deleted file mode 100644

Sqrt

    Arguments

    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

sqrt(input, out=None) -> Tensor

Returns a new tensor with the square-root of the elements of input.

$$
\mbox{out}_{i} = \sqrt{\mbox{input}_{i}}
$$

    Examples

# \dontrun{
a = torch_randn(c(4))
a
    #> torch_tensor -#> 0.4471 -#> 1.6376 -#> 0.0918 -#> 0.6598 -#> [ CPUFloatType{4} ]
    torch_sqrt(a)
    #> torch_tensor -#> 0.6686 -#> 1.2797 -#> 0.3031 -#> 0.8123 -#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_square.html b/docs/reference/torch_square.html
deleted file mode 100644

Square

    Arguments

    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

square(input, out=None) -> Tensor

Returns a new tensor with the square of the elements of input.

    Examples

# \dontrun{
a = torch_randn(c(4))
a
    #> torch_tensor -#> -0.3828 -#> 0.0233 -#> -1.1883 -#> 1.0369 -#> [ CPUFloatType{4} ]
    torch_square(a)
    #> torch_tensor -#> 0.1466 -#> 0.0005 -#> 1.4121 -#> 1.0752 -#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_squeeze.html b/docs/reference/torch_squeeze.html
deleted file mode 100644

Squeeze

    Arguments

    input

    (Tensor) the input tensor.

    dim

    (int, optional) if given, the input will be squeezed only in this dimension

    out

    (Tensor, optional) the output tensor.

Note

The returned tensor shares the storage with the input tensor,
so changing the contents of one will change the contents of the other.

    squeeze(input, dim=None, out=None) -> Tensor

Returns a tensor with all the dimensions of input of size 1 removed.

For example, if input is of shape:
\((A \times 1 \times B \times C \times 1 \times D)\) then the out tensor
will be of shape: \((A \times B \times C \times D)\).

When dim is given, a squeeze operation is done only in the given
dimension. If input is of shape: \((A \times 1 \times B)\),
torch_squeeze(input, 1) leaves the tensor unchanged, but torch_squeeze(input, 2)
will squeeze the tensor to the shape \((A \times B)\) (dimensions are 1-based in R,
as the examples below show).

    Examples

# \dontrun{
x = torch_zeros(c(2, 1, 2, 1, 2))
x
    #> torch_tensor -#> (1,1,1,.,.) = -#> 0 0 -#> -#> (2,1,1,.,.) = -#> 0 0 -#> -#> (1,1,2,.,.) = -#> 0 0 -#> -#> (2,1,2,.,.) = -#> 0 0 -#> [ CPUFloatType{2,1,2,1,2} ]
y = torch_squeeze(x)
y
    #> torch_tensor -#> (1,.,.) = -#> 0 0 -#> 0 0 -#> -#> (2,.,.) = -#> 0 0 -#> 0 0 -#> [ CPUFloatType{2,2,2} ]
y = torch_squeeze(x, 1)
y
    #> torch_tensor -#> (1,1,1,.,.) = -#> 0 0 -#> -#> (2,1,1,.,.) = -#> 0 0 -#> -#> (1,1,2,.,.) = -#> 0 0 -#> -#> (2,1,2,.,.) = -#> 0 0 -#> [ CPUFloatType{2,1,2,1,2} ]
y = torch_squeeze(x, 2)
y
    #> torch_tensor -#> (1,1,.,.) = -#> 0 0 -#> -#> (2,1,.,.) = -#> 0 0 -#> -#> (1,2,.,.) = -#> 0 0 -#> -#> (2,2,.,.) = -#> 0 0 -#> [ CPUFloatType{2,2,1,2} ]
# }

diff --git a/docs/reference/torch_stack.html b/docs/reference/torch_stack.html
deleted file mode 100644

Stack

    Arguments

    tensors

    (sequence of Tensors) sequence of tensors to concatenate

    dim

    (int) dimension to insert. Has to be between 0 and the number of dimensions of concatenated tensors (inclusive)

    out

    (Tensor, optional) the output tensor.

stack(tensors, dim=0, out=None) -> Tensor

Concatenates a sequence of tensors along a new dimension.

All tensors need to be of the same size. A usage sketch follows below.
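Examples

A minimal usage sketch, not from the original page (it assumes the torch R package is attached and that, as elsewhere in the R API, dim is 1-based):

# \dontrun{
a = torch_ones(c(2, 3))
b = torch_zeros(c(2, 3))
torch_stack(list(a, b), dim = 1)  # new leading dimension: shape (2, 2, 3)
torch_stack(list(a, b), dim = 3)  # new trailing dimension: shape (2, 3, 2)
# }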
diff --git a/docs/reference/torch_std.html b/docs/reference/torch_std.html
deleted file mode 100644

Std

    Arguments

    input

    (Tensor) the input tensor.

    unbiased

    (bool) whether to use the unbiased estimation or not

    dim

    (int or tuple of ints) the dimension or dimensions to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

    out

    (Tensor, optional) the output tensor.

std(input, unbiased=True) -> Tensor

Returns the standard-deviation of all elements in the input tensor.

If unbiased is False, then the standard-deviation will be calculated
via the biased estimator. Otherwise, Bessel's correction will be used.

    std(input, dim, unbiased=True, keepdim=False, out=None) -> Tensor

Returns the standard-deviation of each row of the input tensor in the
dimension dim. If dim is a list of dimensions,
reduce over all of them.

If keepdim is True, the output tensor is of the same size
as input except in the dimension(s) dim where it is of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting in the
output tensor having 1 (or length(dim)) fewer dimension(s).

If unbiased is False, then the standard-deviation will be calculated
via the biased estimator. Otherwise, Bessel's correction will be used.

    Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
    #> torch_tensor -#> -0.3162 0.4255 0.1976 -#> [ CPUFloatType{1,3} ]
    torch_std(a)
    #> torch_tensor -#> 0.379947 -#> [ CPUFloatType{} ]
a = torch_randn(c(4, 4))
a
    #> torch_tensor -#> 1.2036 -2.0630 -2.1182 -0.6214 -#> -1.6360 0.5014 -0.1266 -1.7918 -#> 0.2972 -0.5018 1.3086 1.4842 -#> 0.6903 -0.5105 0.5911 0.7067 -#> [ CPUFloatType{4,4} ]
    torch_std(a, dim=1)
    #> torch_tensor -#> 1.2400 -#> 1.0589 -#> 1.4759 -#> 1.4476 -#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_std_mean.html b/docs/reference/torch_std_mean.html
deleted file mode 100644

Std_mean

    Arguments

    input

    (Tensor) the input tensor.

    unbiased

    (bool) whether to use the unbiased estimation or not

    dim

    (int or tuple of ints) the dimension or dimensions to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

std_mean(input, unbiased=True) -> (Tensor, Tensor)

Returns the standard-deviation and mean of all elements in the input tensor.

If unbiased is False, then the standard-deviation will be calculated
via the biased estimator. Otherwise, Bessel's correction will be used.

    std_mean(input, dim, unbiased=True, keepdim=False) -> (Tensor, Tensor)

Returns the standard-deviation and mean of each row of the input tensor in the
dimension dim. If dim is a list of dimensions,
reduce over all of them.

If keepdim is True, the output tensor is of the same size
as input except in the dimension(s) dim where it is of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting in the
output tensor having 1 (or length(dim)) fewer dimension(s).

If unbiased is False, then the standard-deviation will be calculated
via the biased estimator. Otherwise, Bessel's correction will be used.

    Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
    #> torch_tensor -#> -0.8958 2.0764 0.0589 -#> [ CPUFloatType{1,3} ]
    torch_std_mean(a)
    #> [[1]] -#> torch_tensor -#> 1.51748 -#> [ CPUFloatType{} ] -#> -#> [[2]] -#> torch_tensor -#> 0.413156 -#> [ CPUFloatType{} ] -#>
a = torch_randn(c(4, 4))
a
    #> torch_tensor -#> 0.2699 -0.5021 -0.4712 1.0471 -#> -2.2339 0.6823 -0.9388 0.5461 -#> -0.4050 0.9288 0.8127 1.5763 -#> 1.1732 -0.2562 0.5626 0.8124 -#> [ CPUFloatType{4,4} ]
    torch_std_mean(a, 1)
    #> [[1]] -#> torch_tensor -#> 1.4429 -#> 0.6986 -#> 0.8327 -#> 0.4380 -#> [ CPUFloatType{4} ] -#> -#> [[2]] -#> torch_tensor -#> -0.2989 -#> 0.2132 -#> -0.0087 -#> 0.9955 -#> [ CPUFloatType{4} ] -#>
# }

diff --git a/docs/reference/torch_stft.html b/docs/reference/torch_stft.html
deleted file mode 100644

Stft

    Arguments

    input

    (Tensor) the input tensor

    n_fft

    (int) size of Fourier transform

    hop_length

    (int, optional) the distance between neighboring sliding window frames. Default: None (treated as equal to floor(n_fft / 4))

    win_length

    (int, optional) the size of window frame and STFT filter. Default: None (treated as equal to n_fft)

    window

    (Tensor, optional) the optional window function. Default: None (treated as window of all \(1\) s)

    center

    (bool, optional) whether to pad input on both sides so that the \(t\)-th frame is centered at time \(t \times \mbox{hop\_length}\). Default: True

    pad_mode

    (string, optional) controls the padding method used when center is True. Default: "reflect"

    normalized

    (bool, optional) controls whether to return the normalized STFT results Default: False

    onesided

    (bool, optional) controls whether to return half of results to avoid redundancy Default: True

Short-time Fourier transform (STFT).

Ignoring the optional batch dimension, this method computes the following
expression:

$$
X[m, \omega] = \sum_{k = 0}^{\mbox{win\_length} - 1}
\mbox{window}[k]\ \mbox{input}[m \times \mbox{hop\_length} + k]\
\exp\left(- j \frac{2 \pi \cdot \omega k}{\mbox{win\_length}}\right),
$$

where \(m\) is the index of the sliding window, and \(\omega\) is
the frequency, with \(0 \leq \omega < \mbox{n\_fft}\).

* `input` must be either a 1-D time sequence or a 2-D batch of time
  sequences.

* If `hop_length` is `None` (default), it is treated as equal to
  `floor(n_fft / 4)`.

* If `win_length` is `None` (default), it is treated as equal to
  `n_fft`.

* `window` can be a 1-D tensor of size `win_length`, e.g., from
  `torch_hann_window`. If `window` is `None` (default), it is
  treated as if having \(1\) everywhere in the window. If
  \(\mbox{win\_length} < \mbox{n\_fft}\), `window` will be padded on
  both sides to length `n_fft` before being applied.

* If `center` is `True` (default), `input` will be padded on
  both sides so that the \(t\)-th frame is centered at time
  \(t \times \mbox{hop\_length}\). Otherwise, the \(t\)-th frame
  begins at time \(t \times \mbox{hop\_length}\).

* `pad_mode` determines the padding method used on `input` when
  `center` is `True`. See `nnf_pad` for all available options.
  Default is `"reflect"`.

* If `onesided` is `True` (default), only values for \(\omega\)
  in \(\left[0, 1, 2, \dots, \left\lfloor \frac{\mbox{n\_fft}}{2} \right\rfloor + 1\right]\)
  are returned because the real-to-complex Fourier transform satisfies the
  conjugate symmetry, i.e., \(X[m, \omega] = X[m, \mbox{n\_fft} - \omega]^*\).

* If `normalized` is `True` (default is `False`), the function
  returns the normalized STFT results, i.e., multiplied by \((\mbox{frame\_length})^{-0.5}\).

Returns the real and the imaginary parts together as one tensor of size
\((* \times N \times T \times 2)\), where \(*\) is the optional
batch size of `input`, \(N\) is the number of frequencies where
STFT is applied, \(T\) is the total number of frames used, and each pair
in the last dimension represents a complex number as the real part and the
imaginary part.

Warning: this function changed signature at version 0.4.1. Calling with the
previous signature may cause an error or return an incorrect result.
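Examples

A minimal usage sketch, not from the original page (it assumes the torch R package is attached; the argument values are illustrative only):

# \dontrun{
x = torch_randn(c(1000))           # a 1-D time sequence
spec = torch_stft(x, n_fft = 128)  # defaults: hop_length = floor(n_fft / 4), centered, onesided
spec$shape                         # about n_fft / 2 + 1 frequencies x frames x 2 (real, imaginary)
# }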
diff --git a/docs/reference/torch_sum.html b/docs/reference/torch_sum.html
deleted file mode 100644

Sum

    Arguments

    input

    (Tensor) the input tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. If specified, the input tensor is casted to dtype before the operation is performed. This is useful for preventing data type overflows. Default: None.

    dim

    (int or tuple of ints) the dimension or dimensions to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

sum(input, dtype=None) -> Tensor

Returns the sum of all elements in the input tensor.

    sum(input, dim, keepdim=False, dtype=None) -> Tensor

Returns the sum of each row of the input tensor in the given
dimension dim. If dim is a list of dimensions,
reduce over all of them.

If keepdim is True, the output tensor is of the same size
as input except in the dimension(s) dim where it is of size 1.
Otherwise, dim is squeezed (see torch_squeeze), resulting in the
output tensor having 1 (or length(dim)) fewer dimension(s).

    Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
    #> torch_tensor -#> -0.6977 -0.5155 -1.9107 -#> [ CPUFloatType{1,3} ]
    torch_sum(a)
    #> torch_tensor -#> -3.12391 -#> [ CPUFloatType{} ]
a = torch_randn(c(4, 4))
a
    #> torch_tensor -#> -0.4190 0.0463 -0.7716 0.4229 -#> -1.2665 0.4791 -0.5515 -1.0623 -#> 1.1148 -1.2247 0.0682 1.1490 -#> -0.7653 -1.3195 0.4248 -0.6928 -#> [ CPUFloatType{4,4} ]
    torch_sum(a, 1)
    #> torch_tensor -#> -1.3360 -#> -2.0188 -#> -0.8302 -#> -0.1831 -#> [ CPUFloatType{4} ]
b = torch_arange(0, 4 * 5 * 6)$view(c(4, 5, 6))
torch_sum(b, list(2, 1))
    #> torch_tensor -#> 435 -#> 1335 -#> 2235 -#> 3135 -#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_svd.html b/docs/reference/torch_svd.html
deleted file mode 100644

Svd

    Arguments

    input

    (Tensor) the input tensor of size \((*, m, n)\) where * is zero or more batch dimensions consisting of \(m \times n\) matrices.

    some

    (bool, optional) controls the shape of returned U and V

    compute_uv

    (bool, optional) option whether to compute U and V or not

    out

    (tuple, optional) the output tuple of tensors

Note

The singular values are returned in descending order. If input is a batch of matrices,
then the singular values of each matrix in the batch are returned in descending order.

The implementation of SVD on CPU uses the LAPACK routine ?gesdd (a divide-and-conquer
algorithm) instead of ?gesvd for speed. Analogously, the SVD on GPU uses the MAGMA routine
gesdd as well.

Irrespective of the original strides, the returned matrix U
will be transposed, i.e. with strides U.contiguous().transpose(-2, -1).stride().

Extra care needs to be taken when backward is performed through U and V
outputs. Such an operation is really only stable when input is
full rank with all distinct singular values. Otherwise, NaN can
appear as the gradients are not properly defined. Also, notice that
double backward will usually do an additional backward through U and
V even if the original backward is only on S.

When some = False, the gradients on U[..., :, min(m, n):]
and V[..., :, min(m, n):] will be ignored in backward as those vectors
can be arbitrary bases of the subspaces.

When compute_uv = False, backward cannot be performed since U and V
from the forward pass are required for the backward operation.

    svd(input, some=True, compute_uv=True, out=None) -> (Tensor, Tensor, Tensor)

This function returns a namedtuple (U, S, V) which is the singular value
decomposition of an input real matrix or batches of real matrices input such that
\(input = U \times diag(S) \times V^T\).

If some is True (default), the method returns the reduced singular value decomposition,
i.e., if the last two dimensions of input are m and n, then the returned
U and V matrices will contain only \(min(n, m)\) orthonormal columns.

If compute_uv is False, the returned U and V matrices will be zero matrices
of shape \((m \times m)\) and \((n \times n)\) respectively. some will be ignored here.

    Examples

# \dontrun{
a = torch_randn(c(5, 3))
a
    #> torch_tensor -#> -0.7888 -0.7696 0.3986 -#> -0.7711 0.2121 -1.5684 -#> 0.3715 -0.0456 -1.0608 -#> 1.1539 -0.8844 -0.7384 -#> -1.3783 -0.2028 1.6554 -#> [ CPUFloatType{5,3} ]
out = torch_svd(a)
u = out[[1]]
s = out[[2]]
v = out[[3]]
torch_dist(a, torch_mm(torch_mm(u, torch_diag(s)), v$t()))
    #> torch_tensor -#> 8.51333e-07 -#> [ CPUFloatType{} ]
a_big = torch_randn(c(7, 5, 3))
out = torch_svd(a_big)
u = out[[1]]
s = out[[2]]
v = out[[3]]
torch_dist(a_big, torch_matmul(torch_matmul(u, torch_diag_embed(s)), v$transpose(-2, -1)))
    #> torch_tensor -#> 2.71036e-06 -#> [ CPUFloatType{} ]
# }

diff --git a/docs/reference/torch_symeig.html b/docs/reference/torch_symeig.html
deleted file mode 100644

Symeig

    Arguments

    input

    (Tensor) the input tensor of size \((*, n, n)\) where * is zero or more batch dimensions consisting of symmetric matrices.

    eigenvectors

    (boolean, optional) controls whether eigenvectors have to be computed

    upper

    (boolean, optional) controls whether to consider upper-triangular or lower-triangular region

    out

    (tuple, optional) the output tuple of (Tensor, Tensor)

Note

The eigenvalues are returned in ascending order. If input is a batch of matrices,
then the eigenvalues of each matrix in the batch are returned in ascending order.

Irrespective of the original strides, the returned matrix V will
be transposed, i.e. with strides V.contiguous().transpose(-1, -2).stride().

Extra care needs to be taken when backward is performed through outputs. Such
an operation is really only stable when all eigenvalues are distinct.
Otherwise, NaN can appear as the gradients are not properly defined.

    symeig(input, eigenvectors=False, upper=True, out=None) -> (Tensor, Tensor)

This function returns eigenvalues and eigenvectors
of a real symmetric matrix input or a batch of real symmetric matrices,
represented by a namedtuple (eigenvalues, eigenvectors).

This function calculates all eigenvalues (and vectors) of input
such that \(\mbox{input} = V \mbox{diag}(e) V^T\).

The boolean argument eigenvectors defines computation of
both eigenvectors and eigenvalues or eigenvalues only.

If it is False, only eigenvalues are computed. If it is True,
both eigenvalues and eigenvectors are computed.

Since the input matrix input is supposed to be symmetric,
only the upper triangular portion is used by default.

If upper is False, then the lower triangular portion is used.

    Examples

# \dontrun{
a = torch_randn(c(5, 5))
a = a + a$t()  # to make a symmetric
a
    #> torch_tensor -#> 2.1703 -0.5663 -0.5122 -0.2134 -0.0549 -#> -0.5663 0.9832 -0.9685 0.1017 0.9142 -#> -0.5122 -0.9685 -0.8703 0.7874 -0.6067 -#> -0.2134 0.1017 0.7874 2.8112 -0.1549 -#> -0.0549 0.9142 -0.6067 -0.1549 -0.4494 -#> [ CPUFloatType{5,5} ]
o = torch_symeig(a, eigenvectors = TRUE)
e = o[[1]]
v = o[[2]]
e
    #> torch_tensor -#> -1.5747 -#> -0.9141 -#> 1.6122 -#> 2.4168 -#> 3.1047 -#> [ CPUFloatType{5} ]
    v
    #> torch_tensor -#> 0.1632 -0.0736 0.4932 0.7860 -0.3270 -#> 0.3015 -0.4689 0.6486 -0.5181 -0.0109 -#> 0.8988 -0.0414 -0.3445 0.1244 0.2372 -#> -0.1524 0.0538 0.3046 0.2248 0.9114 -#> 0.2266 0.8776 0.3530 -0.2188 -0.0780 -#> [ CPUFloatType{5,5} ]
a_big = torch_randn(c(5, 2, 2))
a_big = a_big + a_big$transpose(-2, -1)  # to make a_big symmetric
o = a_big$symeig(eigenvectors = TRUE)
e = o[[1]]
v = o[[2]]
torch_allclose(torch_matmul(v, torch_matmul(e$diag_embed(), v$transpose(-2, -1))), a_big)
    #> [1] TRUE
# }

diff --git a/docs/reference/torch_t.html b/docs/reference/torch_t.html
deleted file mode 100644

T

    Arguments

    input

    (Tensor) the input tensor.

t(input) -> Tensor

Expects input to be a tensor with 2 or fewer dimensions and transposes dimensions 0
and 1.

0-D and 1-D tensors are returned as is. When input is a 2-D tensor this
is equivalent to transpose(input, 0, 1).

    Examples

# \dontrun{
x = torch_randn(c(2, 3))
x
    #> torch_tensor -#> -1.4813 0.4524 -0.0294 -#> 2.6447 -0.7806 0.5434 -#> [ CPUFloatType{2,3} ]
    torch_t(x)
    #> torch_tensor -#> -1.4813 2.6447 -#> 0.4524 -0.7806 -#> -0.0294 0.5434 -#> [ CPUFloatType{3,2} ]
x = torch_randn(c(3))
x
    #> torch_tensor -#> -0.0126 -#> -0.5420 -#> 0.5410 -#> [ CPUFloatType{3} ]
    torch_t(x)
    #> torch_tensor -#> -0.0126 -#> -0.5420 -#> 0.5410 -#> [ CPUFloatType{3} ]
x = torch_randn(c(2, 3))
x
    #> torch_tensor -#> -1.9746 -0.2671 0.6073 -#> -0.1863 0.6615 1.5133 -#> [ CPUFloatType{2,3} ]
    torch_t(x)
    #> torch_tensor -#> -1.9746 -0.1863 -#> -0.2671 0.6615 -#> 0.6073 1.5133 -#> [ CPUFloatType{3,2} ]
# }

diff --git a/docs/reference/torch_take.html b/docs/reference/torch_take.html
deleted file mode 100644

Take

    Arguments

    input

    (Tensor) the input tensor.

    indices

    (LongTensor) the indices into tensor

take(input, index) -> Tensor

Returns a new tensor with the elements of input at the given indices.
The input tensor is treated as if it were viewed as a 1-D tensor. The result
takes the same shape as the indices.

    Examples

# \dontrun{
src = torch_tensor(matrix(c(4, 3, 5, 6, 7, 8), ncol = 3, byrow = TRUE))
torch_take(src, torch_tensor(c(0, 2, 5), dtype = torch_int64()))
    #> torch_tensor -#> 8 -#> 3 -#> 7 -#> [ CPUFloatType{3} ]
# }

diff --git a/docs/reference/torch_tan.html b/docs/reference/torch_tan.html
deleted file mode 100644

Tan

    Arguments

    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

tan(input, out=None) -> Tensor

Returns a new tensor with the tangent of the elements of input.

$$
\mbox{out}_{i} = \tan(\mbox{input}_{i})
$$

    Examples

# \dontrun{
a = torch_randn(c(4))
a
    #> torch_tensor -#> -1.1158 -#> 0.0583 -#> 0.2334 -#> -0.9159 -#> [ CPUFloatType{4} ]
    torch_tan(a)
    #> torch_tensor -#> -2.0440 -#> 0.0583 -#> 0.2377 -#> -1.3021 -#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_tanh.html b/docs/reference/torch_tanh.html
deleted file mode 100644

Tanh

    Arguments

    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

tanh(input, out=None) -> Tensor

Returns a new tensor with the hyperbolic tangent of the elements
of input.

$$
\mbox{out}_{i} = \tanh(\mbox{input}_{i})
$$

    Examples

# \dontrun{
a = torch_randn(c(4))
a
    #> torch_tensor -#> -0.5525 -#> -0.3071 -#> 0.8392 -#> -0.6511 -#> [ CPUFloatType{4} ]
    torch_tanh(a)
    #> torch_tensor -#> -0.5024 -#> -0.2978 -#> 0.6854 -#> -0.5724 -#> [ CPUFloatType{4} ]
# }

diff --git a/docs/reference/torch_tensor.html b/docs/reference/torch_tensor.html
deleted file mode 100644

Converts R objects to a torch tensor
torch_tensor(
  data,
  dtype = NULL,
  device = NULL,
  requires_grad = FALSE,
  pin_memory = FALSE
)

    Arguments

    data

    an R atomic vector, matrix or array

    dtype

    a torch_dtype instance

    device

a device created with torch_device()

    requires_grad

    if autograd should record operations on the returned tensor.

    pin_memory

If set, the returned tensor will be allocated in pinned memory.


    Examples

# \dontrun{
torch_tensor(c(1, 2, 3, 4))
    #> torch_tensor -#> 1 -#> 2 -#> 3 -#> 4 -#> [ CPUFloatType{4} ]
    torch_tensor(c(1,2,3,4), dtype = torch_int())
    #> torch_tensor -#> 1 -#> 2 -#> 3 -#> 4 -#> [ CPUIntType{4} ]
# }

diff --git a/docs/reference/torch_tensordot.html b/docs/reference/torch_tensordot.html
deleted file mode 100644

Tensordot

    Arguments

    a

    (Tensor) Left tensor to contract

    b

    (Tensor) Right tensor to contract

    dims

    (int or tuple of two lists of integers) number of dimensions to contract or explicit lists of dimensions for a and b respectively

tensordot(a, b, dims=2) -> Tensor

Returns a contraction of a and b over multiple dimensions.

`tensordot` implements a generalized matrix product.

    Examples

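The example block on the original page was empty; a minimal sketch, assuming the torch R package is attached:

# \dontrun{
a = torch_randn(c(3, 4, 5))
b = torch_randn(c(4, 5, 6))
# with dims = 2, the last two dims of `a` are contracted
# with the first two dims of `b`, giving a (3, 6) result
torch_tensordot(a, b, dims = 2)
# }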
diff --git a/docs/reference/torch_threshold_.html b/docs/reference/torch_threshold_.html
deleted file mode 100644

Threshold_

    threshold_(input, threshold, value) -> Tensor

In-place version of torch_threshold.
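Examples

A minimal usage sketch, not from the original page (it assumes the torch R package is attached; elements at or below the threshold are replaced by value):

# \dontrun{
x = torch_tensor(c(-1, 0.5, 2))
torch_threshold_(x, threshold = 1, value = 0)  # modifies x in place
x                                              # -1 and 0.5 become 0; 2 is kept
# }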
diff --git a/docs/reference/torch_topk.html b/docs/reference/torch_topk.html
deleted file mode 100644

Topk

    Arguments

    input

    (Tensor) the input tensor.

    k

    (int) the k in "top-k"

    dim

    (int, optional) the dimension to sort along

    largest

    (bool, optional) controls whether to return largest or smallest elements

    sorted

    (bool, optional) controls whether to return the elements in sorted order

    out

    (tuple, optional) the output tuple of (Tensor, LongTensor) that can be optionally given to be used as output buffers

topk(input, k, dim=None, largest=True, sorted=True, out=None) -> (Tensor, LongTensor)

Returns the k largest elements of the given input tensor along
a given dimension.

If dim is not given, the last dimension of the input is chosen.

If largest is False then the k smallest elements are returned.

A namedtuple of (values, indices) is returned, where the indices are the indices
of the elements in the original input tensor.

The boolean option sorted, if True, will make sure that the returned
k elements are themselves sorted.

    Examples

# \dontrun{
x = torch_arange(1., 6.)
x
    #> torch_tensor -#> 1 -#> 2 -#> 3 -#> 4 -#> 5 -#> [ CPUFloatType{5} ]
    torch_topk(x, 3)
    #> [[1]] -#> torch_tensor -#> 5 -#> 4 -#> 3 -#> [ CPUFloatType{3} ] -#> -#> [[2]] -#> torch_tensor -#> 4 -#> 3 -#> 2 -#> [ CPULongType{3} ] -#>
# }

diff --git a/docs/reference/torch_trace.html b/docs/reference/torch_trace.html
deleted file mode 100644

Trace

    trace(input) -> Tensor

Returns the sum of the elements of the diagonal of the input 2-D matrix.

    Examples

# \dontrun{
x = torch_arange(1., 10.)$view(c(3, 3))
x
    #> torch_tensor -#> 1 2 3 -#> 4 5 6 -#> 7 8 9 -#> [ CPUFloatType{3,3} ]
    torch_trace(x)
    #> torch_tensor -#> 15 -#> [ CPUFloatType{} ]
# }

diff --git a/docs/reference/torch_transpose.html b/docs/reference/torch_transpose.html
deleted file mode 100644

Transpose

    Arguments

    input

    (Tensor) the input tensor.

    dim0

    (int) the first dimension to be transposed

    dim1

    (int) the second dimension to be transposed

transpose(input, dim0, dim1) -> Tensor

Returns a tensor that is a transposed version of input.
The given dimensions dim0 and dim1 are swapped.

The resulting out tensor shares its underlying storage with the
input tensor, so changing the content of one would change the content
of the other.

    Examples

# \dontrun{
x = torch_randn(c(2, 3))
x
    #> torch_tensor -#> 0.3114 -0.1716 0.4504 -#> -0.4723 -1.0927 2.1773 -#> [ CPUFloatType{2,3} ]
    torch_transpose(x, 1, 2)
    #> torch_tensor -#> 0.3114 -0.4723 -#> -0.1716 -1.0927 -#> 0.4504 2.1773 -#> [ CPUFloatType{3,2} ]
# }

diff --git a/docs/reference/torch_trapz.html b/docs/reference/torch_trapz.html
deleted file mode 100644

Trapz

    Arguments

    y

    (Tensor) The values of the function to integrate

    x

    (Tensor) The points at which the function y is sampled. If x is not in ascending order, intervals on which it is decreasing contribute negatively to the estimated integral (i.e., the convention \(\int_a^b f = -\int_b^a f\) is followed).

    dim

    (int) The dimension along which to integrate. By default, use the last dimension.

    dx

    (float) The distance between points at which y is sampled.

trapz(y, x, *, dim=-1) -> Tensor

Estimate \(\int y\,dx\) along dim, using the trapezoid rule.

    trapz(y, *, dx=1, dim=-1) -> Tensor

As above, but the sample points are spaced uniformly at a distance of dx.

    Examples

# \dontrun{
y = torch_randn(list(2, 3))
y
    #> torch_tensor -#> 0.0190 1.0024 1.9078 -#> -0.0511 -0.7302 0.0112 -#> [ CPUFloatType{2,3} ]
x = torch_tensor(matrix(c(1, 3, 4, 1, 2, 3), ncol = 3, byrow = TRUE))
torch_trapz(y, x = x)
    #> torch_tensor -#> 2.4765 -#> -0.7502 -#> [ CPUFloatType{2} ]
# }

diff --git a/docs/reference/torch_triangular_solve.html b/docs/reference/torch_triangular_solve.html
deleted file mode 100644

Triangular_solve

    Arguments

    input

(Tensor) multiple right-hand sides of size \((*, m, k)\) where \(*\) is zero or more batch dimensions (\(b\))

    A

    (Tensor) the input triangular coefficient matrix of size \((*, m, m)\) where \(*\) is zero or more batch dimensions

    upper

    (bool, optional) whether to solve the upper-triangular system of equations (default) or the lower-triangular system of equations. Default: True.

    transpose

    (bool, optional) whether \(A\) should be transposed before being sent into the solver. Default: False.

    unitriangular

    (bool, optional) whether \(A\) is unit triangular. If True, the diagonal elements of \(A\) are assumed to be 1 and not referenced from \(A\). Default: False.

triangular_solve(input, A, upper=True, transpose=False, unitriangular=False) -> (Tensor, Tensor)

Solves a system of equations with a triangular coefficient matrix \(A\)
and multiple right-hand sides \(b\).

In particular, solves \(AX = b\) and assumes \(A\) is upper-triangular
with the default keyword arguments.

torch_triangular_solve(b, A) can take in 2D inputs b, A or inputs that are
batches of 2D matrices. If the inputs are batches, then it returns
batched outputs X.

    Examples

# \dontrun{
A = torch_randn(c(2, 2))$triu()
A
    #> torch_tensor -#> -0.3460 0.1356 -#> 0.0000 1.5035 -#> [ CPUFloatType{2,2} ]
b = torch_randn(c(2, 3))
b
    #> torch_tensor -#> -0.4014 -0.1958 0.0379 -#> -1.3143 -0.0766 -0.3524 -#> [ CPUFloatType{2,3} ]
    torch_triangular_solve(b, A)
    #> [[1]] -#> torch_tensor -#> 0.8174 0.5459 -0.2014 -#> -0.8742 -0.0509 -0.2344 -#> [ CPUFloatType{2,3} ] -#> -#> [[2]] -#> torch_tensor -#> -0.3460 0.1356 -#> 0.0000 1.5035 -#> [ CPUFloatType{2,2} ] -#>
# }

diff --git a/docs/reference/torch_tril.html b/docs/reference/torch_tril.html
deleted file mode 100644

Tril

    Arguments

    input

    (Tensor) the input tensor.

    diagonal

    (int, optional) the diagonal to consider

    out

    (Tensor, optional) the output tensor.

tril(input, diagonal=0, out=None) -> Tensor

Returns the lower triangular part of the matrix (2-D tensor) or batch of matrices
input; the other elements of the result tensor out are set to 0.

The lower triangular part of the matrix is defined as the elements on and
below the diagonal.

The argument diagonal controls which diagonal to consider. If
diagonal = 0, all elements on and below the main diagonal are
retained. A positive value includes just as many diagonals above the main
diagonal, and similarly a negative value excludes just as many diagonals below
the main diagonal. The main diagonal is the set of indices
\(\lbrace (i, i) \rbrace\) for \(i \in [0, \min\{d_{1}, d_{2}\} - 1]\) where
\(d_{1}, d_{2}\) are the dimensions of the matrix.

    Examples

# \dontrun{
a = torch_randn(c(3, 3))
a
    #> torch_tensor -#> -0.6849 0.8749 -0.7531 -#> -0.5497 -1.0238 0.3323 -#> -0.0980 0.0908 -0.4867 -#> [ CPUFloatType{3,3} ]
    torch_tril(a)
    #> torch_tensor -#> -0.6849 0.0000 0.0000 -#> -0.5497 -1.0238 0.0000 -#> -0.0980 0.0908 -0.4867 -#> [ CPUFloatType{3,3} ]
b = torch_randn(c(4, 6))
b
    #> torch_tensor -#> -0.7116 -0.9359 -1.5487 1.6909 0.9290 -1.8224 -#> 2.1791 -0.8098 1.4367 -0.5204 1.0782 -0.4998 -#> 1.3149 -1.0202 -0.4302 -0.5773 0.0928 -1.0440 -#> -1.7950 0.6438 -0.7581 0.0569 -1.0737 1.3707 -#> [ CPUFloatType{4,6} ]
    torch_tril(b, diagonal=1)
    #> torch_tensor -#> -0.7116 -0.9359 0.0000 0.0000 0.0000 0.0000 -#> 2.1791 -0.8098 1.4367 0.0000 0.0000 0.0000 -#> 1.3149 -1.0202 -0.4302 -0.5773 0.0000 0.0000 -#> -1.7950 0.6438 -0.7581 0.0569 -1.0737 0.0000 -#> [ CPUFloatType{4,6} ]
    torch_tril(b, diagonal=-1)
    #> torch_tensor -#> 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 -#> 2.1791 0.0000 0.0000 0.0000 0.0000 0.0000 -#> 1.3149 -1.0202 0.0000 0.0000 0.0000 0.0000 -#> -1.7950 0.6438 -0.7581 0.0000 0.0000 0.0000 -#> [ CPUFloatType{4,6} ]
# }

diff --git a/docs/reference/torch_tril_indices.html b/docs/reference/torch_tril_indices.html
deleted file mode 100644

Tril_indices

    Arguments

    row

    (int) number of rows in the 2-D matrix.

    col

    (int) number of columns in the 2-D matrix.

    offset

    (int) diagonal offset from the main diagonal. Default: if not provided, 0.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, torch_long.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    layout

(torch.layout, optional) currently only supports torch_strided.

Note

When running on CUDA, `row * col` must be less than \(2^{59}\) to
prevent overflow during calculation.

    tril_indices(row, col, offset=0, dtype=torch.long, device='cpu', layout=torch.strided) -> Tensor

Returns the indices of the lower triangular part of a row-by-col
matrix in a 2-by-N Tensor, where the first row contains row
coordinates of all indices and the second row contains column coordinates.
Indices are ordered based on rows and then columns.

The lower triangular part of the matrix is defined as the elements on and
below the diagonal.

The argument offset controls which diagonal to consider. If
offset = 0, all elements on and below the main diagonal are
retained. A positive value includes just as many diagonals above the main
diagonal, and similarly a negative value excludes just as many diagonals below
the main diagonal. The main diagonal is the set of indices
\(\lbrace (i, i) \rbrace\) for \(i \in [0, \min\{d_{1}, d_{2}\} - 1]\)
where \(d_{1}, d_{2}\) are the dimensions of the matrix.

    Examples

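The example block on the original page was empty; a minimal sketch, assuming the torch R package is attached:

# \dontrun{
torch_tril_indices(3, 3)              # a 2-by-6 tensor of row and column coordinates
torch_tril_indices(4, 3, offset = -1)
# }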
diff --git a/docs/reference/torch_triu.html b/docs/reference/torch_triu.html
deleted file mode 100644

Triu

    Arguments

    input

    (Tensor) the input tensor.

    diagonal

    (int, optional) the diagonal to consider

    out

    (Tensor, optional) the output tensor.

    - -

    triu(input, diagonal=0, out=None) -> Tensor

    - - - - -

    Returns the upper triangular part of a matrix (2-D tensor) or batch of matrices -input, the other elements of the result tensor out are set to 0.

    -

    The upper triangular part of the matrix is defined as the elements on and -above the diagonal.

    -

    The argument diagonal controls which diagonal to consider. If -diagonal = 0, all elements on and above the main diagonal are -retained. A positive value excludes just as many diagonals above the main -diagonal, and similarly a negative value includes just as many diagonals below -the main diagonal. The main diagonal are the set of indices -\(\lbrace (i, i) \rbrace\) for \(i \in [0, \min\{d_{1}, d_{2}\} - 1]\) where -\(d_{1}, d_{2}\) are the dimensions of the matrix.


    Examples

# \dontrun{
a = torch_randn(c(3, 3))
a
#> torch_tensor
#> -0.3956 -0.1868  0.0874
#> -0.3459 -0.2626 -0.7964
#>  0.5357 -0.4833 -0.2087
#> [ CPUFloatType{3,3} ]
torch_triu(a)
#> torch_tensor
#> -0.3956 -0.1868  0.0874
#>  0.0000 -0.2626 -0.7964
#>  0.0000  0.0000 -0.2087
#> [ CPUFloatType{3,3} ]
torch_triu(a, diagonal=1)
#> torch_tensor
#> 0.01 *
#>  0.0000 -18.6760   8.7416
#>  0.0000   0.0000 -79.6368
#>  0.0000   0.0000   0.0000
#> [ CPUFloatType{3,3} ]
torch_triu(a, diagonal=-1)
#> torch_tensor
#> -0.3956 -0.1868  0.0874
#> -0.3459 -0.2626 -0.7964
#>  0.0000 -0.4833 -0.2087
#> [ CPUFloatType{3,3} ]
b = torch_randn(c(4, 6))
b
#> torch_tensor
#> -0.1247  0.3568  1.5481  0.9310  0.2551 -1.8148
#>  0.7493  0.8313 -0.6427  0.3658 -0.2912  0.3553
#>  0.9661  2.0171  0.9854 -0.1047 -1.6832 -0.0952
#>  0.0011  0.5442  0.5278 -0.5429  0.4507 -0.8038
#> [ CPUFloatType{4,6} ]
torch_triu(b, diagonal=1)
#> torch_tensor
#>  0.0000  0.3568  1.5481  0.9310  0.2551 -1.8148
#>  0.0000  0.0000 -0.6427  0.3658 -0.2912  0.3553
#>  0.0000  0.0000  0.0000 -0.1047 -1.6832 -0.0952
#>  0.0000  0.0000  0.0000  0.0000  0.4507 -0.8038
#> [ CPUFloatType{4,6} ]
torch_triu(b, diagonal=-1)
#> torch_tensor
#> -0.1247  0.3568  1.5481  0.9310  0.2551 -1.8148
#>  0.7493  0.8313 -0.6427  0.3658 -0.2912  0.3553
#>  0.0000  2.0171  0.9854 -0.1047 -1.6832 -0.0952
#>  0.0000  0.0000  0.5278 -0.5429  0.4507 -0.8038
#> [ CPUFloatType{4,6} ]
# }
diff --git a/docs/reference/torch_triu_indices.html b/docs/reference/torch_triu_indices.html
deleted file mode 100644
index 405d0bcf31240076d21e8a25e5d560eb24e87021..0000000000000000000000000000000000000000
--- a/docs/reference/torch_triu_indices.html
+++ /dev/null
@@ -1,251 +0,0 @@
Triu_indices — torch_triu_indices • torch

    Triu_indices


    Arguments

    row

    (int) number of rows in the 2-D matrix.

    col

    (int) number of columns in the 2-D matrix.

    offset

    (int) diagonal offset from the main diagonal. Default: if not provided, 0.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, torch_long.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    layout

(torch.layout, optional) currently only torch_strided is supported.

Note

When running on CUDA, row * col must be less than \(2^{59}\) to prevent overflow during calculation.

triu_indices(row, col, offset=0, dtype=torch.long, device='cpu', layout=torch.strided) -> Tensor

Returns the indices of the upper triangular part of a row-by-col matrix in a 2-by-N Tensor, where the first row contains row coordinates of all indices and the second row contains column coordinates. Indices are ordered based on rows and then columns.

The upper triangular part of the matrix is defined as the elements on and above the diagonal.

The argument offset controls which diagonal to consider. If offset = 0, all elements on and above the main diagonal are retained. A positive value excludes just as many diagonals above the main diagonal, and similarly a negative value includes just as many diagonals below the main diagonal. The main diagonal is the set of indices \(\lbrace (i, i) \rbrace\) for \(i \in [0, \min\{d_{1}, d_{2}\} - 1]\) where \(d_{1}, d_{2}\) are the dimensions of the matrix.


    Examples
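A minimal sketch, assuming the R binding exposes this as torch_triu_indices() with the signature above:

# \dontrun{
# a minimal sketch, assuming the binding torch_triu_indices()
# indices of the upper triangle of a 3x3 matrix
torch_triu_indices(3, 3)
# keep only entries strictly above the main diagonal
torch_triu_indices(4, 3, offset = 1)
# }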


diff --git a/docs/reference/torch_true_divide.html b/docs/reference/torch_true_divide.html
deleted file mode 100644
index d627152402560a0345e0c054c51d46fe17e684ab..0000000000000000000000000000000000000000
--- a/docs/reference/torch_true_divide.html
+++ /dev/null
@@ -1,233 +0,0 @@
True_divide — torch_true_divide • torch

    True_divide


    Arguments

    dividend

    (Tensor) the dividend

    divisor

    (Tensor or Scalar) the divisor

true_divide(dividend, divisor) -> Tensor

Performs "true division" that always computes the division in floating point. Analogous to division in Python 3 and equivalent to torch_div except when both inputs have bool or integer scalar types, in which case they are cast to the default (floating) scalar type before the division.


$$
\mbox{out}_i = \frac{\mbox{dividend}_i}{\mbox{divisor}}
$$


    Examples

# \dontrun{
dividend = torch_tensor(c(5, 3), dtype=torch_int())
divisor = torch_tensor(c(3, 2), dtype=torch_int())
torch_true_divide(dividend, divisor)
#> torch_tensor
#> 1.6667
#> 1.5000
#> [ CPUFloatType{2} ]
torch_true_divide(dividend, 2)
#> torch_tensor
#> 2.5000
#> 1.5000
#> [ CPUFloatType{2} ]
# }
diff --git a/docs/reference/torch_trunc.html b/docs/reference/torch_trunc.html
deleted file mode 100644
index c74cf9db9bcf403f78bd562f16ebfc1871817baf..0000000000000000000000000000000000000000
--- a/docs/reference/torch_trunc.html
+++ /dev/null
@@ -1,231 +0,0 @@
Trunc — torch_trunc • torch

    Trunc


    Arguments

    input

    (Tensor) the input tensor.

    out

    (Tensor, optional) the output tensor.

trunc(input, out=None) -> Tensor

Returns a new tensor with the truncated integer values of the elements of input.


    Examples

# \dontrun{
a = torch_randn(c(4))
a
#> torch_tensor
#> -0.4632
#> -1.3494
#>  0.0517
#> -1.1300
#> [ CPUFloatType{4} ]
torch_trunc(a)
#> torch_tensor
#> -0
#> -1
#>  0
#> -1
#> [ CPUFloatType{4} ]
# }
diff --git a/docs/reference/torch_unbind.html b/docs/reference/torch_unbind.html
deleted file mode 100644
index 4b1e3cee16de1ce989cac8a42c8e11c4e98223db..0000000000000000000000000000000000000000
--- a/docs/reference/torch_unbind.html
+++ /dev/null
@@ -1,240 +0,0 @@
Unbind — torch_unbind • torch

    Unbind


    Arguments

    input

    (Tensor) the tensor to unbind

    dim

    (int) dimension to remove

unbind(input, dim=0) -> seq

    Removes a tensor dimension.


    Returns a tuple of all slices along a given dimension, already without it.


    Examples

# \dontrun{
torch_unbind(torch_tensor(matrix(1:9, ncol = 3, byrow=TRUE)))
#> [[1]]
#> torch_tensor
#>  1
#>  2
#>  3
#> [ CPUIntType{3} ]
#>
#> [[2]]
#> torch_tensor
#>  4
#>  5
#>  6
#> [ CPUIntType{3} ]
#>
#> [[3]]
#> torch_tensor
#>  7
#>  8
#>  9
#> [ CPUIntType{3} ]
#>
# }
diff --git a/docs/reference/torch_unique_consecutive.html b/docs/reference/torch_unique_consecutive.html
deleted file mode 100644
index 0fbc2af9d197cc2d64795dab36e47dc961c389e1..0000000000000000000000000000000000000000
--- a/docs/reference/torch_unique_consecutive.html
+++ /dev/null
@@ -1,293 +0,0 @@
Unique_consecutive — torch_unique_consecutive • torch

    Unique_consecutive


    Arguments

    input

    (Tensor) the input tensor

    return_inverse

    (bool) Whether to also return the indices for where elements in the original input ended up in the returned unique list.

    return_counts

    (bool) Whether to also return the counts for each unique element.

    dim

(int) the dimension to apply unique. If None, the unique of the flattened input is returned. Default: None.

unique_consecutive(input, return_inverse=False, return_counts=False, dim=None) -> Tensor

    Eliminates all but the first element from every consecutive group of equivalent elements.

Note: this function is different from torch_unique() in that it only eliminates consecutive duplicate values. Its semantics are similar to those of std::unique in C++.

    Examples

# \dontrun{
x = torch_tensor(c(1, 1, 2, 2, 3, 1, 1, 2))
output = torch_unique_consecutive(x)
output
#> [[1]]
#> torch_tensor
#>  1
#>  2
#>  3
#>  1
#>  2
#> [ CPUFloatType{5} ]
#>
#> [[2]]
#> torch_tensor
#> [ CPULongType{0} ]
#>
#> [[3]]
#> torch_tensor
#> [ CPULongType{0} ]
#>
torch_unique_consecutive(x, return_inverse=TRUE)
#> [[1]]
#> torch_tensor
#>  1
#>  2
#>  3
#>  1
#>  2
#> [ CPUFloatType{5} ]
#>
#> [[2]]
#> torch_tensor
#>  0
#>  0
#>  1
#>  1
#>  2
#>  3
#>  3
#>  4
#> [ CPULongType{8} ]
#>
#> [[3]]
#> torch_tensor
#> [ CPULongType{0} ]
#>
torch_unique_consecutive(x, return_counts=TRUE)
#> [[1]]
#> torch_tensor
#>  1
#>  2
#>  3
#>  1
#>  2
#> [ CPUFloatType{5} ]
#>
#> [[2]]
#> torch_tensor
#> [ CPULongType{0} ]
#>
#> [[3]]
#> torch_tensor
#>  2
#>  2
#>  1
#>  2
#>  1
#> [ CPULongType{5} ]
#>
# }
diff --git a/docs/reference/torch_unsqueeze.html b/docs/reference/torch_unsqueeze.html
deleted file mode 100644
index bab4d422dc3c3a361ba82836c35e1cf5a97139db..0000000000000000000000000000000000000000
--- a/docs/reference/torch_unsqueeze.html
+++ /dev/null
@@ -1,232 +0,0 @@
Unsqueeze — torch_unsqueeze • torch

    Unsqueeze


    Arguments

    input

    (Tensor) the input tensor.

    dim

    (int) the index at which to insert the singleton dimension

unsqueeze(input, dim) -> Tensor

Returns a new tensor with a dimension of size one inserted at the specified position.

The returned tensor shares the same underlying data with this tensor.

A dim value within the range [-input.dim() - 1, input.dim() + 1) can be used. Negative dim will correspond to unsqueeze applied at dim = dim + input.dim() + 1.


    Examples

# \dontrun{
x = torch_tensor(c(1, 2, 3, 4))
torch_unsqueeze(x, 1)
#> torch_tensor
#>  1  2  3  4
#> [ CPUFloatType{1,4} ]
torch_unsqueeze(x, 2)
#> torch_tensor
#>  1
#>  2
#>  3
#>  4
#> [ CPUFloatType{4,1} ]
# }
diff --git a/docs/reference/torch_var.html b/docs/reference/torch_var.html
deleted file mode 100644
index 33aa596d43b2283cc851df5d40da7e6abbb6e141..0000000000000000000000000000000000000000
--- a/docs/reference/torch_var.html
+++ /dev/null
@@ -1,264 +0,0 @@
Var — torch_var • torch

    Var


    Arguments

    input

    (Tensor) the input tensor.

    unbiased

    (bool) whether to use the unbiased estimation or not

    dim

    (int or tuple of ints) the dimension or dimensions to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

    out

    (Tensor, optional) the output tensor.

var(input, unbiased=True) -> Tensor

    Returns the variance of all elements in the input tensor.


If unbiased is False, then the variance will be calculated via the biased estimator. Otherwise, Bessel's correction will be used.


    var(input, dim, keepdim=False, unbiased=True, out=None) -> Tensor


Returns the variance of each row of the input tensor in the given dimension dim.

If keepdim is True, the output tensor is of the same size as input except in the dimension(s) dim where it is of size 1. Otherwise, dim is squeezed (see torch_squeeze), resulting in the output tensor having 1 (or len(dim)) fewer dimension(s).

If unbiased is False, then the variance will be calculated via the biased estimator. Otherwise, Bessel's correction will be used.


    Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
#> torch_tensor
#>  0.9097 -1.4605 -0.9481
#> [ CPUFloatType{1,3} ]
torch_var(a)
#> torch_tensor
#> 1.55533
#> [ CPUFloatType{} ]
a = torch_randn(c(4, 4))
a
#> torch_tensor
#>  0.9322 -0.2661 -1.1364 -0.1596
#> -0.8385  0.0366 -0.8830 -0.5310
#>  1.6003  0.1409 -0.4186  2.4136
#> -0.7193 -0.5766  0.0958 -0.3928
#> [ CPUFloatType{4,4} ]
torch_var(a, 1)
#> torch_tensor
#> 1.4709
#> 0.1046
#> 0.2947
#> 1.9483
#> [ CPUFloatType{4} ]
# }
diff --git a/docs/reference/torch_var_mean.html b/docs/reference/torch_var_mean.html
deleted file mode 100644
index c96403f68b963749bcfe55d691f5d5ff93ac5967..0000000000000000000000000000000000000000
--- a/docs/reference/torch_var_mean.html
+++ /dev/null
@@ -1,277 +0,0 @@
Var_mean — torch_var_mean • torch

    Var_mean


    Arguments

    input

    (Tensor) the input tensor.

    unbiased

    (bool) whether to use the unbiased estimation or not

    dim

    (int or tuple of ints) the dimension or dimensions to reduce.

    keepdim

    (bool) whether the output tensor has dim retained or not.

var_mean(input, unbiased=True) -> (Tensor, Tensor)

    Returns the variance and mean of all elements in the input tensor.


If unbiased is False, then the variance will be calculated via the biased estimator. Otherwise, Bessel's correction will be used.


    var_mean(input, dim, keepdim=False, unbiased=True) -> (Tensor, Tensor)


Returns the variance and mean of each row of the input tensor in the given dimension dim.

If keepdim is True, the output tensor is of the same size as input except in the dimension(s) dim where it is of size 1. Otherwise, dim is squeezed (see torch_squeeze), resulting in the output tensor having 1 (or len(dim)) fewer dimension(s).

If unbiased is False, then the variance will be calculated via the biased estimator. Otherwise, Bessel's correction will be used.


    Examples

# \dontrun{
a = torch_randn(c(1, 3))
a
#> torch_tensor
#>  1.4242 -0.2759  0.7106
#> [ CPUFloatType{1,3} ]
torch_var_mean(a)
#> [[1]]
#> torch_tensor
#> 0.728761
#> [ CPUFloatType{} ]
#>
#> [[2]]
#> torch_tensor
#> 0.61961
#> [ CPUFloatType{} ]
#>
a = torch_randn(c(4, 4))
a
#> torch_tensor
#>  0.0532 -0.3304 -0.5824  1.8389
#> -2.0310 -0.6095 -0.1087  0.3034
#> -1.3414  1.7987 -0.3098  0.8658
#>  0.0799 -0.7031 -0.5875 -1.0066
#> [ CPUFloatType{4,4} ]
torch_var_mean(a, 1)
#> [[1]]
#> torch_tensor
#> 1.1034
#> 1.4015
#> 0.0538
#> 1.4116
#> [ CPUFloatType{4} ]
#>
#> [[2]]
#> torch_tensor
#> -0.8098
#>  0.0389
#> -0.3971
#>  0.5004
#> [ CPUFloatType{4} ]
#>
# }
diff --git a/docs/reference/torch_where.html b/docs/reference/torch_where.html
deleted file mode 100644
index 414bbe04b0b3de9f7f060eef39b466601199d668..0000000000000000000000000000000000000000
--- a/docs/reference/torch_where.html
+++ /dev/null
@@ -1,244 +0,0 @@
Where — torch_where • torch

    Where


    Arguments

    condition

    (BoolTensor) When True (nonzero), yield x, otherwise yield y

    x

    (Tensor) values selected at indices where condition is True

    y

    (Tensor) values selected at indices where condition is False

Note

The tensors condition, x, y must be broadcastable.

See also torch_nonzero().

    where(condition, x, y) -> Tensor


    Return a tensor of elements selected from either x or y, depending on condition.


    The operation is defined as:


$$
\mbox{out}_i = \left\{ \begin{array}{ll}
\mbox{x}_i & \mbox{if } \mbox{condition}_i \\
\mbox{y}_i & \mbox{otherwise} \\
\end{array} \right.
$$


    where(condition) -> tuple of LongTensor


torch_where(condition) is identical to torch_nonzero(condition, as_tuple=True).


    Examples
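A minimal sketch, assuming the R binding torch_where() and the tensor comparison operators used elsewhere in these docs:

# \dontrun{
x <- torch_randn(c(3, 2))
y <- torch_ones(c(3, 2))
# take entries from x where the condition holds, from y elsewhere
torch_where(x > 0, x, y)
# single-argument form, assumed to mirror
# torch_nonzero(condition, as_tuple=True) as described above
torch_where(x > 0)
# }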


diff --git a/docs/reference/torch_zeros.html b/docs/reference/torch_zeros.html
deleted file mode 100644
index ba06eadd68d489d7fe92a823988cd23506214dd5..0000000000000000000000000000000000000000
--- a/docs/reference/torch_zeros.html
+++ /dev/null
@@ -1,245 +0,0 @@
Zeros — torch_zeros • torch

    Zeros


    Arguments

    size

    (int...) a sequence of integers defining the shape of the output tensor. Can be a variable number of arguments or a collection like a list or tuple.

    out

    (Tensor, optional) the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned tensor. Default: if None, uses a global default (see torch_set_default_tensor_type).

    layout

    (torch.layout, optional) the desired layout of returned Tensor. Default: torch_strided.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, uses the current device for the default tensor type (see torch_set_default_tensor_type). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

zeros(*size, out=None, dtype=None, layout=torch.strided, device=None, requires_grad=False) -> Tensor

Returns a tensor filled with the scalar value 0, with the shape defined by the variable argument size.


    Examples

# \dontrun{
torch_zeros(c(2, 3))
#> torch_tensor
#>  0  0  0
#>  0  0  0
#> [ CPUFloatType{2,3} ]
torch_zeros(c(5))
#> torch_tensor
#>  0
#>  0
#>  0
#>  0
#>  0
#> [ CPUFloatType{5} ]
# }
diff --git a/docs/reference/torch_zeros_like.html b/docs/reference/torch_zeros_like.html
deleted file mode 100644
index 6f6fcc309e49cd81291c1126469a1b23998263b9..0000000000000000000000000000000000000000
--- a/docs/reference/torch_zeros_like.html
+++ /dev/null
@@ -1,248 +0,0 @@
Zeros_like — torch_zeros_like • torch

    Zeros_like


    Arguments

    input

    (Tensor) the size of input will determine size of the output tensor.

    dtype

    (torch.dtype, optional) the desired data type of returned Tensor. Default: if None, defaults to the dtype of input.

    layout

    (torch.layout, optional) the desired layout of returned tensor. Default: if None, defaults to the layout of input.

    device

    (torch.device, optional) the desired device of returned tensor. Default: if None, defaults to the device of input.

    requires_grad

    (bool, optional) If autograd should record operations on the returned tensor. Default: False.

    memory_format

    (torch.memory_format, optional) the desired memory format of returned Tensor. Default: torch_preserve_format.

zeros_like(input, dtype=None, layout=None, device=None, requires_grad=False, memory_format=torch.preserve_format) -> Tensor

Returns a tensor filled with the scalar value 0, with the same size as input. torch_zeros_like(input) is equivalent to torch_zeros(input.size(), dtype=input.dtype, layout=input.layout, device=input.device).


    Warning


As of 0.4, this function does not support an out keyword. As an alternative, the old torch_zeros_like(input, out=output) is equivalent to torch_zeros(input.size(), out=output).


    Examples

# \dontrun{
input = torch_empty(c(2, 3))
torch_zeros_like(input)
#> torch_tensor
#>  0  0  0
#>  0  0  0
#> [ CPUFloatType{2,3} ]
# }
diff --git a/docs/reference/utils_dataset.html b/docs/reference/utils_dataset.html
deleted file mode 100644
index a4d271a0304d4410faaede1ff04560e49f4b23ff..0000000000000000000000000000000000000000
--- a/docs/reference/utils_dataset.html
+++ /dev/null
@@ -1,192 +0,0 @@
An abstract class representing a Dataset. — utils_dataset • torch

All datasets that represent a map from keys to data samples should subclass it. All subclasses should overwrite get_item, supporting fetching a data sample for a given key. Subclasses could also optionally overwrite length, which is expected to return the size of the dataset by many torch.utils.data.Sampler implementations and the default options of torch.utils.data.DataLoader.

utils_dataset(..., name = NULL)

    Arguments

    ...

    public methods for the dataset class


    Note


torch.utils.data.DataLoader by default constructs an index sampler that yields integral indices. To make it work with a map-style dataset with non-integral indices/keys, a custom sampler must be provided.
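A minimal sketch of a map-style dataset, assuming utils_dataset() accepts the public methods described above via ... and provides R6-style initialize/self semantics (these are assumptions for illustration, not from this page):

# \dontrun{
squares <- utils_dataset(
  name = "squares_dataset",
  # hypothetical constructor storing the data
  initialize = function(n) {
    self$x <- torch_tensor(as.numeric(1:n))
  },
  # fetch one sample for a given key
  get_item = function(i) {
    self$x[i]^2
  },
  # size of the dataset
  length = function() {
    self$x$size(1)
  }
)
# }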

diff --git a/docs/reference/utils_dataset_tensor.html b/docs/reference/utils_dataset_tensor.html
deleted file mode 100644
index 6c87496be5097edb3059e0313216cb52625633c4..0000000000000000000000000000000000000000
--- a/docs/reference/utils_dataset_tensor.html
+++ /dev/null
@@ -1,176 +0,0 @@
Dataset wrapping tensors. — utils_dataset_tensor • torch

    Each sample will be retrieved by indexing tensors along the first dimension.

utils_dataset_tensor(...)

    Arguments

    ...

tensors that have the same size in the first dimension.
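A hedged sketch of wrapping two tensors, assuming samples are drawn by indexing along the first dimension as described:

# \dontrun{
x <- torch_randn(c(10, 3))  # 10 samples of 3 features
y <- torch_randn(c(10))     # 10 targets
ds <- utils_dataset_tensor(x, y)
# }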

diff --git a/docs/reference/vision_make_grid.html b/docs/reference/vision_make_grid.html
deleted file mode 100644
index e3c2b7b24495cdad5c10a3d5da6d7fd1ff917bf2..0000000000000000000000000000000000000000
--- a/docs/reference/vision_make_grid.html
+++ /dev/null
@@ -1,229 +0,0 @@
A simplified version of torchvision.utils.make_grid. — vision_make_grid • torch

Arranges a batch of (image) tensors in a grid, with optional padding between images. Expects a 4d mini-batch tensor of shape (B x C x H x W).

vision_make_grid(
  tensor,
  scale = TRUE,
  num_rows = 8,
  padding = 2,
  pad_value = 0
)

    Arguments

    tensor

    tensor to arrange in grid

    scale

    whether to normalize (min-max-scale) the input tensor

    num_rows

    number of rows making up the grid (default 8)

    padding

    amount of padding between batch images (default 2)

    pad_value

    pixel value to use for padding
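A hedged usage sketch, assuming an illustrative batch of 16 single-channel 28x28 images (the shapes are not from the original page):

# \dontrun{
batch <- torch_randn(c(16, 1, 28, 28))
grid <- vision_make_grid(batch, num_rows = 4, padding = 2)
grid$size()
# }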

diff --git a/docs/reference/with_enable_grad.html b/docs/reference/with_enable_grad.html
deleted file mode 100644
index 77e29315cb94d7a50d45a9d6b0b3ba3fca331950..0000000000000000000000000000000000000000
--- a/docs/reference/with_enable_grad.html
+++ /dev/null
@@ -1,226 +0,0 @@
Enable grad — with_enable_grad • torch

Context manager that enables gradient calculation, if it has been disabled via with_no_grad.

with_enable_grad(code)

    Arguments

    code

    code to be executed with gradient recording.


    Details


This context manager is thread local; it will not affect computation in other threads.


    Examples

# \dontrun{
x <- torch_tensor(1, requires_grad=TRUE)
with_no_grad({
  with_enable_grad({
    y = x * 2
  })
})
y$backward()
x$grad
#> torch_tensor
#> 2
#> [ CPUFloatType{1} ]
# }
diff --git a/docs/reference/with_no_grad.html b/docs/reference/with_no_grad.html
deleted file mode 100644
index 18994141059f5aeadede0d9d766453f2cdbc10c5..0000000000000000000000000000000000000000
--- a/docs/reference/with_no_grad.html
+++ /dev/null
@@ -1,226 +0,0 @@
Temporarily modify gradient recording. — with_no_grad • torch

    Temporarily modify gradient recording.

with_no_grad(code)

    Arguments

    code

    code to be executed with no gradient recording.


    Examples

# \dontrun{
x <- torch_tensor(runif(5), requires_grad = TRUE)
with_no_grad({
  x$sub_(torch_tensor(as.numeric(1:5)))
})
#> torch_tensor
#> -0.1943
#> -1.1859
#> -2.5961
#> -3.7816
#> -4.5816
#> [ CPUFloatType{5} ]
x
#> torch_tensor
#> -0.1943
#> -1.1859
#> -2.5961
#> -3.7816
#> -4.5816
#> [ CPUFloatType{5} ]
x$grad
#> torch_tensor
#> [ Tensor (undefined) ]
# }