## IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) - New TOC Alert for Publication #3477

- 2012 Index IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), Vol. 42 (December 4, 2012, 8:45 pm)
This index covers all technical items - papers, correspondence, reviews, etc. - that appeared in this periodical during the year, and items from previous years that were commented upon or corrected in this year. Departments and other items may also be covered if they have been judged to have archival value. The Author Index contains the primary entry for each item, listed under the first author's name. The primary entry includes the co-authors' names, the title of the paper or other item, and its location, specified by the publication abbreviation, year, month, and inclusive pagination. The Subject Index contains entries describing the item under all appropriate subject headings, plus the first author's name, the publication abbreviation, month, and year, and inclusive pages. Note that the item title is found only under the primary entry in the Author Index.

- Multivariate Multilinear Regression (November 16, 2012, 3:18 pm)
Conventional regression methods, such as multivariate linear regression (MLR) and its extension, principal component regression (PCR), deal well with situations in which the data take the form of low-dimensional vectors. As the dimensionality grows, however, they run into the undersample problem (USP): the dimensionality of the feature space is much higher than the number of training samples. Little attention has been paid to this problem. This paper first presents an in-depth investigation of the USP in PCR, which answers three questions: 1) Why does the USP arise? 2) Under what conditions does it occur? 3) How does it affect regression? Building on this analysis, the principal-component-selection problem of PCR is presented. Subsequently, to address this problem, a multivariate multilinear regression (MMR) model is proposed as a substitute for MLR when the objects are multilinear. The basic idea of MMR is to transfer the multilinear structure of the objects into the regression coefficients as a constraint. As a result, the regression problem reduces to finding two low-dimensional coefficient matrices, so that the principal-component-selection problem is avoided. Moreover, the sample size needed to solve MMR is greatly reduced, which alleviates the USP. As MMR has no closed-form solution, an alternating projection procedure is designed to obtain the regression matrices. For completeness, the computational cost is analyzed and convergence is proved. Furthermore, MMR is applied to model the fitting procedure in the active appearance model (AAM). Experiments conducted on both a carefully designed synthetic data set and AAM fitting databases verify the theoretical analysis.
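To give a feel for the idea of folding the multilinear structure into two low-dimensional coefficient matrices, the following sketch fits a bilinear model Y_i ≈ Uᵀ X_i V by alternating least squares. This is a hypothetical simplification for illustration (the function name, shapes, and the plain least-squares updates are assumptions), not the paper's exact MMR formulation or its alternating projection procedure.

```python
import numpy as np

def mmr_als(X, Y, p, q, n_iter=200, seed=0):
    """Alternating least-squares sketch for a bilinear regression
    Y_i ~ U.T @ X_i @ V, with X_i (m x n) and Y_i (p x q).
    Each half-step is an ordinary linear least-squares problem."""
    rng = np.random.default_rng(seed)
    N, m, n = X.shape
    U = rng.standard_normal((m, p))
    V = rng.standard_normal((n, q))
    for _ in range(n_iter):
        # Fix V, solve for U:  Y_i.T = (X_i V).T U  stacked over samples
        Z = np.concatenate([(X[i] @ V).T for i in range(N)])  # (N*q, m)
        B = np.concatenate([Y[i].T for i in range(N)])        # (N*q, p)
        U = np.linalg.lstsq(Z, B, rcond=None)[0]
        # Fix U, solve for V:  Y_i = (U.T X_i) V  stacked over samples
        W = np.concatenate([U.T @ X[i] for i in range(N)])    # (N*p, n)
        C = np.concatenate(list(Y))                           # (N*p, q)
        V = np.linalg.lstsq(W, C, rcond=None)[0]
    return U, V
```

Because only the two small factors U and V are estimated rather than a full m·n coefficient matrix, far fewer samples are needed, which is the sense in which the USP is alleviated.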

- Linearithmic Time Sparse and Convex Maximum Margin Clustering (November 16, 2012, 3:18 pm)
Recently, a new clustering method called maximum margin clustering (MMC) was proposed and has shown promising performance. It was originally formulated as a difficult nonconvex integer problem. To make the MMC problem practical, researchers have either relaxed the original problem into inefficient convex optimization problems or reformulated it as nonconvex optimization problems that sacrifice convexity for efficiency. No existing approach is both convex and efficient. In this paper, a new linearithmic-time sparse and convex MMC algorithm, called support-vector-regression-based MMC (SVR-MMC), is proposed. It first uses the SVR as the core of the MMC. The problem is then relaxed into a convex optimization problem, which is solved iteratively by the cutting-plane algorithm. Each cutting-plane subproblem is further decomposed into a series of supervised SVR problems by a new global extended-level method (GELM). Finally, each supervised SVR problem is solved in linear time by a new sparse-kernel SVR (SKSVR) algorithm. We further extend the SVR-MMC algorithm to the multiple-kernel clustering (MKC) problem and the multiclass MMC (M3C) problem, denoted SVR-MKC and SVR-M3C, respectively. One key point of the algorithms is the use of the SVR, which prevents the MMC and its extensions from running into an integer matrix programming problem. Another key point is the new SKSVR, which provides a linear-time interface to nonlinear kernel scenarios, so that SVR-MMC and its extensions retain a linearithmic time complexity with nonlinear kernels. Our experimental results on various real-world data sets demonstrate the effectiveness and efficiency of the SVR-MMC and its two extensions. Moreover, the unsupervised application of the SVR-MKC to voice activity detection (VAD) shows that it can achieve performance close to its supervised counterpart, meet the real-time demand of VAD, and require no labeling for model training.
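To give a feel for how a regression machine can drive clustering (though without the convex cutting-plane machinery of SVR-MMC), here is a toy two-cluster sketch in which ordinary ridge regression stands in for the SVR core: it alternates between fitting a linear scorer to the current ±1 labels and relabeling points by a balanced threshold on the scores. The function name, the PCA initialization, and the median-based balance constraint are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def toy_margin_clustering(X, n_iter=10):
    """Toy alternating scheme loosely inspired by regression-based
    margin clustering: fit a linear scorer to current +/-1 labels,
    then relabel by thresholding the scores at their median (which
    keeps the two clusters balanced)."""
    Xc = X - X.mean(axis=0)
    # Initialize labels from the top principal direction of the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    y = np.sign(Xc @ Vt[0])
    y[y == 0] = 1.0
    A = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    for _ in range(n_iter):
        # Ridge least-squares fit of f(x) = w.x + b to current labels
        w = np.linalg.solve(A.T @ A + 1e-3 * np.eye(A.shape[1]), A.T @ y)
        scores = A @ w
        y_new = np.where(scores > np.median(scores), 1.0, -1.0)
        if np.array_equal(y_new, y):
            break
        y = y_new
    return y
```

On well-separated data this alternation stabilizes in a few iterations; the hard part that SVR-MMC solves, and this toy does not, is doing the label optimization convexly and in linearithmic time.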

- Approximate Optimal Control Design for Nonlinear One-Dimensional Parabolic PDE Systems Using Empirical Eigenfunctions and Neural Network (November 16, 2012, 3:18 pm)
This paper addresses the approximate optimal control problem for a class of parabolic partial differential equation (PDE) systems with nonlinear spatial differential operators. An approximate optimal control design method is proposed on the basis of empirical eigenfunctions (EEFs) and a neural network (NN). First, based on data collected from the PDE system, the Karhunen-Loève decomposition is used to compute the EEFs. With these EEFs, the PDE system is formulated as a high-order ordinary differential equation (ODE) system. To further reduce its dimension, the singular perturbation (SP) technique is employed to derive a reduced-order model (ROM) that accurately describes the dominant dynamics of the PDE system. Second, the Hamilton-Jacobi-Bellman (HJB) method is applied to synthesize an optimal controller based on the ROM, where the closed-loop asymptotic stability of the high-order ODE system is guaranteed by SP theory. The optimal control law is divided into two parts: the linear part is obtained by solving an algebraic Riccati equation, and a new type of HJB-like equation is derived for designing the nonlinear part. Third, a control update strategy based on successive approximation is proposed to solve the HJB-like equation, and its convergence is proved. Furthermore, an NN approach is used to approximate the cost function. Finally, we apply the developed approximate optimal control method to a diffusion-reaction process with a nonlinear spatial operator, and the simulation results illustrate its effectiveness.
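The successive-approximation flavor of such a design can be illustrated on the linear part alone: Kleinman's classical iteration solves the algebraic Riccati equation through a sequence of Lyapunov equations, each improving the current feedback gain. A minimal numpy sketch on a toy two-state system follows; it is a standard textbook construction, not the paper's NN-based scheme for the HJB-like equation.

```python
import numpy as np

def lyap(Ac, Qk):
    """Solve Ac.T P + P Ac = -Qk by Kronecker vectorization (small n)."""
    n = Ac.shape[0]
    M = np.kron(np.eye(n), Ac.T) + np.kron(Ac.T, np.eye(n))
    p = np.linalg.solve(M, -Qk.reshape(-1, order='F'))
    return p.reshape((n, n), order='F')

def kleinman_are(A, B, Q, R, K0, n_iter=20):
    """Kleinman's successive approximation for the continuous-time ARE:
    starting from a stabilizing gain K0, repeatedly solve a Lyapunov
    equation for the cost matrix P and update K = R^{-1} B.T P."""
    K = K0
    for _ in range(n_iter):
        Ac = A - B @ K                     # closed-loop dynamics
        P = lyap(Ac, Q + K.T @ R @ K)      # policy-evaluation step
        K = np.linalg.solve(R, B.T @ P)    # policy-improvement step
    return P, K
```

Each pass is "evaluate the current controller, then improve it", which is the same successive-approximation pattern the paper applies to the nonlinear HJB-like equation.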

- Supervised Latent Linear Gaussian Process Latent Variable Model for Dimensionality Reduction (November 16, 2012, 3:18 pm)
The Gaussian process (GP) latent variable model (GPLVM) is capable of learning a low-dimensional manifold from highly nonlinear, high-dimensional data. As an unsupervised dimensionality reduction (DR) algorithm, the GPLVM has been successfully applied in many areas. However, in its current setting, the GPLVM cannot use label information, which is available for many tasks; researchers have therefore proposed many extensions to the GPLVM to exploit such extra information, among which the supervised GPLVM (SGPLVM) has shown better performance than the other extensions. However, the SGPLVM suffers from high computational complexity. Bearing in mind the issues of complexity and the need to incorporate additional available information, in this paper we propose a novel SGPLVM, called the supervised latent linear GPLVM (SLLGPLVM). Our approach is motivated by both the SGPLVM and supervised probabilistic principal component analysis (SPPCA). The proposed SLLGPLVM can be viewed as an appropriate compromise between the SGPLVM and the SPPCA. It can also be interpreted as a semiparametric regression model for supervised DR that uses the GP to model the unknown smooth link function. Complexity analysis and experiments show that the developed SLLGPLVM outperforms the SGPLVM not only in computational complexity but also in accuracy. We also compare the SLLGPLVM with two classical supervised classifiers, a GP classifier and a support vector machine, to illustrate the advantages of the proposed model.
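The semiparametric reading above, a linear map z = Wᵀx followed by a GP-modelled smooth link g, can be sketched in a few lines of GP regression. In this illustration W is fixed by hand rather than learned as in the SLLGPLVM, and the kernel, noise level, and function names are assumptions for the sketch.

```python
import numpy as np

def rbf(A, B, ell=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

def gp_link_predict(X, y, Xs, W, noise=1e-4):
    """Semiparametric model y ~ g(W.T x): project inputs with a fixed
    (here hand-picked, hypothetical) linear map W, then regress the
    targets with a GP posterior mean in the latent space."""
    Z, Zs = X @ W, Xs @ W                      # latent coordinates
    K = rbf(Z, Z) + noise * np.eye(len(Z))     # kernel + jitter
    return rbf(Zs, Z) @ np.linalg.solve(K, y)  # GP posterior mean
```

The SLLGPLVM's contribution is precisely that W is not hand-picked: the linear latent map is learned jointly with the GP, which is what keeps the complexity below that of the full SGPLVM.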