fabiasp {fabia}    R Documentation
fabiasp: R implementation of fabias, therefore it is slow.
fabiasp(X,cyc,alpha,spz,p,norm=1)
X: the data matrix.

cyc: number of cycles to run.

alpha: sparseness of the loadings, enforced via projection (0.1 - 0.9).

spz: sparseness of the factors (0.5 - 4.0).

p: number of hidden factors = number of biclusters.

norm: should the data be standardized; default = 1 (yes, using the mean), 2 (yes, using the median).
Biclusters are found by sparse factor analysis where both the factors and the loadings are sparse.
Essentially the model is the sum of outer products of sparse vectors. The number of summands p is the number of biclusters.
X = L Z + U

X = sum_{i=1}^{p} L_i (Z_i)^T + U

where L_i is the i-th column of L and (Z_i)^T is the i-th row of Z.
If the nonzero components of the sparse vectors are grouped together then the outer product results in a matrix with a nonzero block and zeros elsewhere.
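To illustrate this block structure, the following sketch (not part of the package; all sizes and values are made up) builds a small data matrix from p = 2 sparse loading and factor vectors, so that each outer product contributes one bicluster block:

set.seed(1)
n <- 20                      # number of rows (e.g. genes)
l <- 15                      # number of columns (e.g. samples)
L <- matrix(0, n, 2)         # sparse loadings, one column per bicluster
Z <- matrix(0, 2, l)         # sparse factors, one row per bicluster
L[1:5, 1]   <- rnorm(5, mean = 3)   # nonzero block of the first loading
Z[1, 1:4]   <- rnorm(4, mean = 2)   # nonzero block of the first factor
L[10:14, 2] <- rnorm(5, mean = 3)   # nonzero block of the second loading
Z[2, 8:12]  <- rnorm(5, mean = 2)   # nonzero block of the second factor
U <- matrix(rnorm(n * l, sd = 0.3), n, l)   # additive noise
X <- L %*% Z + U             # sum of the two outer products plus noise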
The model selection is performed by a variational approach according to Girolami 2001 and Palmer et al. 2006.
The prior has finite support; therefore, after each update the loadings are projected onto this finite support. The projection is done according to Hoyer, 2004: given an l_1-norm and an l_2-norm, minimize the Euclidean distance to the original vector (currently the l_2-norm is fixed to 1). The projection is a convex quadratic problem which is solved iteratively, where at each iteration at least one component is set to zero. Instead of the l_1-norm, a sparseness measurement is used which relates the l_1-norm to the l_2-norm.
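As a rough sketch of this sparseness measurement (assuming the definition from Hoyer, 2004; the projection used by the package itself is presumably the one exposed as projfunc / nprojfunc in the See Also list below):

sparseness <- function(x) {
  # relates the l_1-norm to the l_2-norm: close to 1 for a single nonzero
  # entry, close to 0 for a vector with all entries equal in absolute value
  n <- length(x)
  (sqrt(n) - sum(abs(x)) / sqrt(sum(x^2))) / (sqrt(n) - 1)
}
# With the l_2-norm fixed to 1, a target sparseness sp corresponds to the
# l_1-norm that the projection aims at:
#   l1_target <- sqrt(n) - sp * (sqrt(n) - 1)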
The code is implemented in R and is therefore slow.
LZ: estimated noise-free data L Z.

L: loadings L.

Z: factors Z.

Psi: noise variance σ.

lapla: variational parameter.
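A minimal sketch of accessing these components (assuming X is a data matrix as in the examples below; the components are accessed by name, as in the demos):

res <- fabiasp(X, 50, 0.8, 1.0, 3)
noiseFree <- res$LZ     # estimated noise-free data L Z
loadings  <- res$L      # sparse loadings
factors   <- res$Z      # sparse factors
noiseVar  <- res$Psi    # noise variance
lap       <- res$lapla  # variational parameters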
Sepp Hochreiter
Mark Girolami, ‘A Variational Method for Learning Sparse and Overcomplete Representations’, Neural Computation 13(11): 2517-2532, 2001.
J. Palmer, D. Wipf, K. Kreutz-Delgado, B. Rao, ‘Variational EM algorithms for non-Gaussian latent variable models’, Advances in Neural Information Processing Systems 18, pp. 1059-1066, 2006.
Patrik O. Hoyer, ‘Non-negative Matrix Factorization with Sparseness Constraints’, Journal of Machine Learning Research 5:1457-1469, 2004.
fabi, fabia, fabiap, fabias, mfsc, nmfdiv, nmfeu, nmfsc, nprojfunc, projfunc, make_fabi_data, make_fabi_data_blocks, make_fabi_data_pos, make_fabi_data_blocks_pos, extract_plot, extract_bic, myImagePlot, PlotBicluster, Breast_A, DLBCL_B, Multi_A, fabiaDemo, fabiaVersion
#---------------
# TEST
#---------------

dat <- make_fabi_data_blocks(n = 100, l = 50, p = 3, f1 = 5, f2 = 5,
       of1 = 5, of2 = 10, sd_noise = 3.0, sd_z_noise = 0.2, mean_z = 2.0,
       sd_z = 1.0, sd_l_noise = 0.2, mean_l = 3.0, sd_l = 1.0)

X <- dat[[1]]
Y <- dat[[2]]

resEx <- fabiasp(X, 50, 0.8, 1.0, 3)

## Not run:

#---------------
# DEMO1
#---------------

dat <- make_fabi_data_blocks(n = 1000, l = 100, p = 10, f1 = 5, f2 = 5,
       of1 = 5, of2 = 10, sd_noise = 3.0, sd_z_noise = 0.2, mean_z = 2.0,
       sd_z = 1.0, sd_l_noise = 0.2, mean_l = 3.0, sd_l = 1.0)

X <- dat[[1]]
Y <- dat[[2]]

resToy <- fabiasp(X, 200, 0.6, 1.0, 13)

rToy <- extract_plot(X, resToy$L, resToy$Z, ti = "FABIASP", Y = Y)

#---------------
# DEMO2
#---------------

data(Breast_A)

X <- as.matrix(XBreast)

resBreast <- fabiasp(X, 200, 0.4, 1.0, 5)

rBreast <- extract_plot(X, resBreast$L, resBreast$Z,
           ti = "FABIASP Breast cancer(Veer)")

# sorting of predefined labels CBreast

#---------------
# DEMO3
#---------------

data(Multi_A)

X <- as.matrix(XMulti)

resMulti <- fabiasp(X, 200, 0.4, 1.0, 5)

rMulti <- extract_plot(X, resMulti$L, resMulti$Z,
          ti = "FABIASP Multiple tissues(Su)")

# sorting of predefined labels CMulti

#---------------
# DEMO4
#---------------

data(DLBCL_B)

X <- as.matrix(XDLBCL)

resDLBCL <- fabiasp(X, 200, 0.6, 1.0, 5)

rDLBCL <- extract_plot(X, resDLBCL$L, resDLBCL$Z,
          ti = "FABIASP Lymphoma(Rosenwald)")

# sorting of predefined labels CDLBCL

## End(Not run)