I have a 100x100 matrix of rgamma realisations. For each row I need x = mean(row)/var(row), so that I can then find SD(x). How do I loop through my matrix to do this? Please kindly help, I am unsure of R syntax.
data = matrix(rgamma(100,4.623371,2.757291), nrow=100)
rowMeans(data)
CodePudding user response:
Given a matrix data, the following divides the row mean by the variance of that same row, for all rows:
apply(data, 1, \(x) mean(x) / var(x))
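Since the end goal is SD(x), the vector of row-wise ratios can be passed straight to sd(). A minimal sketch putting the pieces together (the gamma parameters are taken from the question; note that a true 100x100 matrix needs 1e4 draws, not 100):

```r
set.seed(1)  # for reproducibility of the sketch
# 100 x 100 matrix of gamma realisations (1e4 = 100 * 100 values)
data <- matrix(rgamma(1e4, 4.623371, 2.757291), nrow = 100)

# x: mean divided by variance, computed row by row
x <- apply(data, 1, \(x) mean(x) / var(x))

length(x)  # one ratio per row, so 100 values
sd(x)      # the SD(x) the question asks for
```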
CodePudding user response:
First, to get a 100 x 100 matrix, you need an input of 1e4 (100*100) values. Second, you could look at the sparseMatrixStats package, which has the functions rowMeans2 and rowVars, as well as rowSds. I'm not sure if you want SDs per row or the SD of the totals.
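As a sketch of that route, the matrixStats package (the dense-matrix counterpart of sparseMatrixStats, exporting the same rowMeans2 / rowVars / rowSds functions) avoids apply() entirely; this assumes the package is installed:

```r
library(matrixStats)

set.seed(1)
# 1e4 draws so the matrix really is 100 x 100
data <- matrix(rgamma(1e4, 4.623371, 2.757291), nrow = 100)

# mean/variance per row, fully vectorised
x <- rowMeans2(data) / rowVars(data)

sd(x)          # SD of the 100 row-wise ratios
rowSds(data)   # or per-row SDs, if that is what was meant instead
```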