Research

RESEARCH INTERESTS


Learning Systems

Syntactic Pattern Recognition

Statistical Pattern Recognition

Adaptive Data Structures

Artificial Neural Networks

Database Systems

Data Compression

Robotics

Data Retrieval and Storage

The following are some of the major results obtained in the area of Statistical Pattern Recognition.
  1. Have pioneered (together with Dr. Thomas) the new paradigm of "Anti-Bayesian" Pattern Recognition. The initial results were presented as a plenary talk at CIARP'12 (Proc. CIARP 2012, PR 2013). These results were generalized for the exponential family (Proc. ICIAR 2013, PR 2014), and for multi-dimensional features (Proc. CORES 2013, PR 2013). The results have been extended to Prototype Reduction Schemes (Proc. AI 2013), Border Identification (Proc. CAIP 2013), clustering (Proc. IEA/AIE 2015) and text classification (Proc. ICCCI 2015, TCCCI 2016). (A toy sketch of this idea appears after this list.)
  2. Have enhanced the Self-Organizing Map (SOM) by incorporating into it any arbitrary user-defined tree-based topology. The resulting neural network, the TTO-SOM, was presented as a plenary talk at CORES'09 (Proc. CORES 2009, Inf. Sci. 2011). This result was further enhanced by allowing the structure of the tree itself to change adaptively (Proc. AI 2009: this paper won the Best Paper Award of the Conference; PR 2014). This paradigm has also been used in semi-supervised Pattern Recognition (Proc. IEA/AIE 2015, Proc. AI 2011, PR 2013). The entire theory of merging the fields of SOMs and adaptive data structures is based on joint work with Astudillo. (A brief sketch of a tree-neighbourhood SOM update follows the list.)
  3. Have proposed a method for training and testing a classifier when there is no training data. This method, which draws on principles of simulation, Learning Automata (LA) and pattern recognition, was used to study a disease outbreak model. The corresponding paper won the "Best Paper Award" of the 2008 International Conference on Health Sciences Simulation held in Ottawa, Canada, in April 2008 (Proc. ICHSS 2008).
  4. Have devised a method to estimate distributions involving unobservable events for optimal search with unknown target distributions (PAA 2008).
  5. Have shown how Dissimilarity-based Classification can be optimized using Prototype Reduction Schemes (Proc. ICIAR 2006, Pat. Rec. 2007). The corresponding talk was a Plenary/Keynote Talk of the ICIAR Conference.
  6. Have shown how Kernel-based Fisher discriminant analysis can be optimized using Prototype Reduction Schemes (Proc. SSPR 2006, IEEE T:SMC 2008).
  7. Have devised a new philosophy for pattern recognition, namely, the concept of Chaotic Pattern Recognition. These results were presented as Plenary/Keynote talks at PRIP'2005 (Minsk, Belarus, May 2005) and at CORES'2005 (Wroclaw, Poland, May 2005), and in PAA 2007.
  8. Have solved the "inverse" of the above problem, namely that of a chaotic model for inaccurate perception (Proc. SCIA 2005, IEEE T:SMC 2007).
  9. Have devised new prototype reduction schemes for the case when the training sets are time-varying (Proc. AI 2005, Pat. Rec. 2006).
  10. Have used state-space search techniques to determine the dimension to be used in the design of nonlinear Kernel-based classification methods (Proc. AI 2004, IEEE T:PAMI 2005).
  11. Have proposed a formal approach to using data distributions for building causal polytrees (Proc. ISMIS 2003, Inf. Sci. 2004). The generation of data for this problem was published in (Proc. IEEE-SMC 2002).
  12. Have used prototype reduction schemes by themselves (Pat. Rec. 2004) and together with fusion techniques to enhance Kernel-based classification methods (Proc. AI 2002, IEEE T:PAMI 2005).
  13. Have devised various new prototype reduction schemes for pattern classification. The first one augments traditional methods with a Vector Quantization-type algorithm to create a novel hybrid scheme (Pat. Rec. 2003). The second new method was one which recursively subdivided the data, and was very effective for large data sets (Proc. SSSPR 2002, IEEE T:SMC 2004).
  14. Have used pattern classification techniques to formally prove why heuristic functions work in AI schemes (The AI Journal 2005).
  15. Have developed a formal theory for pairwise linear Bayesian classifiers for Normal distributions. The results for the two-dimensional case were published in IEEE T:PAMI 2002, and for the d-dimensional case in Pat. Rec. 2002. (The familiar equal-covariance special case is sketched after this list.)
  16. Have developed new methods for the Bootstrap estimation of the Bhattacharyya error bound for Bayesian classifiers for Normal distributions (NNA 2001, PRIS 2001). (A bootstrap sketch of this bound follows the list.)
  17. Have developed an approximation technique capable of approximating functions, curves and images in both the time and frequency domains (IEEE T:PAMI 1997).
  18. Have derived moment methods for decomposing mixture distributions for the entire single-parameter exponential family (IEEE T:SMC 1995).
  19. Have generalized the scheme devised in 1992 for approximating the dependence tree of continuous normal vectors. In this case the optimality of the solution has been shown (Pat. Rec. 1993).
  20. Have worked in the area of density approximation, and have proposed a new technique for the estimation of dependence trees using the Chi-Squared metric for discrete variables (Pat. Rec. 1992). (A spanning-tree sketch of this idea appears after this list.)
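
The Python sketches below illustrate a few of the ideas above. They are simplified readings under stated assumptions, not the published algorithms.

First, for item 1, a minimal one-dimensional sketch of a quantile-based ("anti-Bayesian") decision rule: the test point is compared against order-statistic points of each class rather than against the class means. The function names, the choice p = 2/3 and the nearest-quantile rule are assumptions made here for illustration; the precise formulation is that of the cited papers.

import numpy as np

def quantile_points(samples, p=2.0/3.0):
    # Symmetric pair of quantile points for one class (p = 2/3 is an assumed choice).
    return np.quantile(samples, [1.0 - p, p])

def anti_bayesian_classify(x, class_samples, p=2.0/3.0):
    # Assign x to the class whose nearest quantile point is closest, i.e.
    # compare against points that lie away from the class means.
    best_label, best_dist = None, np.inf
    for label, samples in class_samples.items():
        d = np.min(np.abs(quantile_points(samples, p) - x))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

Here class_samples is assumed to be a dictionary mapping each class label to a one-dimensional array of training samples.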
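
For item 2, a short sketch of an SOM-style update whose neighbourhood is defined over a user-supplied tree rather than a lattice. The names tree_som_step and tree_distances, the learning rate and the hop-count radius are illustrative assumptions; the TTO-SOM's actual Best Matching Unit search, neighbourhood function and training schedule are those of the cited papers.

import numpy as np
from collections import deque

def tree_distances(adj, start):
    # Hop distances from `start` to every node of a tree given as an adjacency list.
    dist = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def tree_som_step(weights, adj, x, lr=0.1, radius=1):
    # One update: locate the best-matching node, then pull it and its tree
    # neighbours (within `radius` hops) towards the input x.
    bmu = min(weights, key=lambda j: np.linalg.norm(x - weights[j]))
    for j, d in tree_distances(adj, bmu).items():
        if d <= radius:
            weights[j] = weights[j] + lr * (x - weights[j])
    return bmu

The weights argument is a dictionary from tree node to weight vector, and adj is the adjacency list of the user-defined tree.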
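
For item 15, the textbook equal-covariance case in which the optimal Bayesian boundary between two Normal classes is exactly linear; the cited papers establish pairwise linear classifiers under more general conditions, so the shared-covariance assumption and the helper names below are purely illustrative.

import numpy as np

def linear_bayes_discriminant(mu1, mu2, sigma, p1=0.5, p2=0.5):
    # Linear discriminant for two Normal classes sharing the covariance `sigma`.
    w = np.linalg.solve(sigma, mu1 - mu2)          # normal vector of the separating hyperplane
    b = -0.5 * (mu1 + mu2) @ w + np.log(p1 / p2)   # offset, including the priors
    return w, b

def classify(x, w, b):
    # Return 1 if x lies on class 1's side of the hyperplane, else 2.
    return 1 if w @ x + b > 0 else 2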
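
For item 16, a plain plug-in estimate of the Bhattacharyya bound on the Bayes error for two Normal classes, averaged over bootstrap resamples. This only sketches the idea; the resampling scheme, the number of resamples and the function names are assumptions, and the refined estimators are those of the cited papers.

import numpy as np

def bhattacharyya_bound(x1, x2, p1=0.5, p2=0.5):
    # Plug-in Bhattacharyya bound sqrt(p1*p2)*exp(-D_B) under Normal class models.
    mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
    s1 = np.atleast_2d(np.cov(x1, rowvar=False))
    s2 = np.atleast_2d(np.cov(x2, rowvar=False))
    s = 0.5 * (s1 + s2)
    diff = mu1 - mu2
    db = (diff @ np.linalg.solve(s, diff)) / 8.0 + 0.5 * np.log(
        np.linalg.det(s) / np.sqrt(np.linalg.det(s1) * np.linalg.det(s2)))
    return np.sqrt(p1 * p2) * np.exp(-db)

def bootstrap_bound(x1, x2, n_boot=200, seed=None):
    # Average the plug-in bound over bootstrap resamples of each class.
    rng = np.random.default_rng(seed)
    vals = []
    for _ in range(n_boot):
        b1 = x1[rng.integers(0, len(x1), len(x1))]
        b2 = x2[rng.integers(0, len(x2), len(x2))]
        vals.append(bhattacharyya_bound(b1, b2))
    return float(np.mean(vals))

Each class is assumed to be given as an (n, d) array of training samples.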
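
Finally, for item 20, a Chow-Liu-style sketch of the dependence-tree idea: a Pearson Chi-Squared statistic is computed for every pair of discrete variables, and a maximum-weight spanning tree is extracted over those scores. This illustrates the general construction only, not the specific estimator of Pat. Rec. 1992.

import numpy as np
from itertools import combinations

def chi_squared(data, i, j):
    # Pearson Chi-Squared statistic between discrete columns i and j of `data`.
    xi, xj = data[:, i], data[:, j]
    vi, vj = np.unique(xi), np.unique(xj)
    obs = np.array([[np.sum((xi == a) & (xj == b)) for b in vj] for a in vi], dtype=float)
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    return float(np.sum((obs - expected) ** 2 / expected))

def dependence_tree(data):
    # Maximum-weight spanning tree (Kruskal's algorithm) over pairwise Chi-Squared scores.
    d = data.shape[1]
    edges = sorted(((chi_squared(data, i, j), i, j)
                    for i, j in combinations(range(d), 2)), reverse=True)
    parent = list(range(d))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:            # keep the edge only if it does not create a cycle
            parent[ri] = rj
            tree.append((i, j, w))
    return tree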