Date of publication: 2017-08-24 04:28
A thesis paper or project forms a capstone to the program. A student must complete the graduate seminar and submit an acceptable proposal to the computer science faculty before registering for, or initiating, a thesis or project.
Typically, the data are entered into a number of data files (e.g., text files or Excel files), and the data in those files are then fed into a statistical package for analysis. As the work progresses, new data files often have to be prepared to test further hypotheses or to perform exploratory analysis. This approach is very time-consuming and error-prone.
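The manual merge step described above can be automated. The sketch below is illustrative only: the file layout and the "value" column name are assumptions, and a real project would adapt this to its own data formats.

```python
# Minimal sketch: merging the "value" column from several CSV-style
# data files into one dataset before statistical analysis.
# The column name "value" is a hypothetical example.
import csv
import io
import statistics

def merge_values(files):
    """Collect the 'value' column from several CSV sources into one list."""
    values = []
    for f in files:
        for row in csv.DictReader(f):
            values.append(float(row["value"]))
    return values

# Two in-memory files stand in for separate data files on disk.
f1 = io.StringIO("value\n1.0\n2.0\n")
f2 = io.StringIO("value\n3.0\n")
data = merge_values([f1, f2])
print(statistics.mean(data))  # 2.0
```

Because the merge is scripted rather than done by hand, re-running the analysis after new data files arrive is a single command instead of a fresh round of copy-and-paste.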
The scheme can be applied to Natural Language Processing, Sentiment Analysis and Question-Answering Systems to serve as a tool for identifying the precise meaning of a word, and consequently to achieve Word Sense Disambiguation.
The rapid advances in recent years in the areas of integrated circuit electronics, wireless communication and micro-electromechanical systems have led to the emergence of wireless sensor network technology. The 'Smart Dust' project at the University of California, Berkeley, introduced the vision of self-configuring networks of inexpensive, small (~6 mm³) nodes (with a processor, a radio transceiver and sensors) for a wide range of applications such as environment monitoring, health care and agriculture.
The comparative genomics approach compares two or more genomes (the total heritable portion of an organism). Traditional visual presentations have centred on linear tracks with connecting lines to show points of similarity or difference. In this project you will overlay large amounts of comparative data on a set of 3D surfaces, controlled and interacted with through devices such as the Xbox Kinect.
In today's world, web searches are a major activity undertaken by people for industrial, research and other reasons. They involve searches across a very wide range of web pages from a wide range of sources. The searcher may download pages, extract information from them, and, in the process, create a history of link activations. The problem people face is what happens when the searcher has to stop and then resume the process days later.
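One simple way to make such a search resumable is to persist the session state (the query and the history of link activations) to disk. The sketch below is a hypothetical illustration: the JSON layout and field names are assumptions, not part of any existing system.

```python
# Hedged sketch: saving and restoring a search session so the
# searcher can resume days later. The file format is illustrative.
import json

def save_session(path, query, visited):
    """Persist the query and the list of visited links as JSON."""
    with open(path, "w") as f:
        json.dump({"query": query, "visited": visited}, f)

def resume_session(path):
    """Reload a previously saved session."""
    with open(path) as f:
        state = json.load(f)
    return state["query"], state["visited"]

save_session("session.json", "wireless sensor networks",
             ["https://example.org/a", "https://example.org/b"])
query, visited = resume_session("session.json")
print(query, len(visited))  # wireless sensor networks 2
```

A real tool would also record extracted information and per-page notes, but the save/restore round trip is the core of the resumption problem.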
We also supervise in selection of Master thesis topics in Computer Science, Electronics and Communication, Computer Vision, Machine Learning and Artificial Intelligence along with the assistance in the paper publication process in IEEE journals, Thomson Reuters, Science Citation Indexed journals and various reputed International Journals.
Computational approaches for motif discovery in DNA sequences have demonstrated promising results. However, the existing motif-finding tools lack reliability and scalability, and the results vary from run to run. This uncertainty frustrates biologists because they do not know which result will be verified in the wet laboratory. It is therefore worthwhile to develop more reliable search tools to reduce laboratory costs. Students taking this project need strong programming skills and knowledge of data mining and computational intelligence.
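For orientation, the simplest exact form of the problem can be sketched as finding the most frequent k-mer across a set of sequences. Real motif finders allow degenerate positions and use stochastic search, which is exactly why their results vary between runs; this deterministic toy version is an assumption-laden illustration, not one of the existing tools.

```python
# Minimal illustration of exact motif search: the most frequent
# k-mer across a set of DNA sequences. Real motif discovery tools
# handle inexact matches and use stochastic optimisation.
from collections import Counter

def most_frequent_kmer(sequences, k):
    """Return the most common k-mer and its total count."""
    counts = Counter()
    for seq in sequences:
        for i in range(len(seq) - k + 1):
            counts[seq[i:i + k]] += 1
    kmer, n = counts.most_common(1)[0]
    return kmer, n

print(most_frequent_kmer(["ACGTACGT", "TTACGTT"], 4))  # ('ACGT', 3)
```

Because this version is exhaustive and deterministic, it always returns the same answer, which highlights by contrast where the run-to-run variability of heuristic tools comes from.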
Prokaryotes, single-celled organisms such as bacteria, do not have an enclosed nucleus, so their DNA floats in the cytoplasm. This project will use computational techniques to analyse DNA sequences to assess supercoiling in the context of packing large amounts of DNA, and its implications for 3D structure. Machine learning, profile generation and statistical techniques will be combined to generate a suite of predictive tools for the bioinformatics community.
The aim of web services is to make data resources available over the Internet to applications (programs) written in any language. There are two approaches to web services: SOAP-based (where "SOAP" stands for "Simple Object Access Protocol") and RESTful (where "REST" stands for "Representational State Transfer"). RESTful web services are now generally recognized as the most useful way to provide data services for web and mobile application development.
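The RESTful style can be summarised as resources addressed by URLs and manipulated with standard HTTP verbs, with data returned in a format such as JSON. The dispatcher below is a minimal sketch of that idea; the "books" resource and its data are hypothetical, and a real service would sit behind an HTTP server.

```python
# Minimal sketch of RESTful dispatch: (HTTP verb, URL path) pairs
# mapped to resource operations returning JSON. The "books"
# resource is a hypothetical example.
import json

BOOKS = {"1": {"title": "Networks"}, "2": {"title": "Databases"}}

def handle(method, path):
    """Dispatch a request the way a RESTful service would."""
    parts = path.strip("/").split("/")
    if method == "GET" and parts == ["books"]:
        return 200, json.dumps(BOOKS)          # collection resource
    if method == "GET" and len(parts) == 2 and parts[0] == "books":
        book = BOOKS.get(parts[1])             # individual resource
        return (200, json.dumps(book)) if book else (404, "{}")
    return 405, "{}"                           # verb not supported here

print(handle("GET", "/books/1"))  # (200, '{"title": "Networks"}')
```

The key design point is that the URL names the resource and the verb names the operation, so any client that speaks HTTP and JSON can use the service regardless of its implementation language.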
To build fuzzy inference systems or neuro-fuzzy intelligent systems, the extraction of a set of fuzzy rules from numerical data plays a key role in successful modelling or forecasting of time series data. This project aims to further develop robust extraction of a fuzzy rulebase using association analysis in data mining. It is expected that the generated rulebase will yield a fuzzy system that is more robust in the presence of uncertainty. Some typical business applications will be employed to assess the merits and shortcomings of the proposed techniques. Students will have a good opportunity to further develop their knowledge base and implementation skills with this project.
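The association-analysis step behind rule extraction can be sketched as scoring candidate rules "A implies B" by support and confidence over a set of transactions. The transactions, item names and thresholds below are illustrative assumptions; a fuzzy rulebase would additionally attach membership functions to the items.

```python
# Hedged sketch of association rule mining: score candidate rules
# A -> B by support (fraction of transactions containing both) and
# confidence (fraction of A-transactions also containing B).
# The transactions and thresholds are illustrative only.
from itertools import combinations

def mine_rules(transactions, min_support=0.5, min_conf=0.7):
    """Return (antecedent, consequent, support, confidence) tuples."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    rules = []
    for a, b in combinations(items, 2):
        for ante, cons in ((a, b), (b, a)):
            both = sum(1 for t in transactions if ante in t and cons in t)
            ante_n = sum(1 for t in transactions if ante in t)
            if ante_n and both / n >= min_support and both / ante_n >= min_conf:
                rules.append((ante, cons, both / n, both / ante_n))
    return rules

tx = [{"high_temp", "high_load"}, {"high_temp", "high_load"},
      {"low_temp"}, {"high_temp", "high_load"}]
print(mine_rules(tx))
```

Each surviving rule is a candidate for conversion into a fuzzy rule, with support and confidence informing how much weight the rule should carry in the inference system.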
THIS PROJECT: What should the design rules for a system of the ZAIA type look like? One way of answering this question would be to design and demonstrate such a tool, for example tabbase.
The challenges of big data analysis include investigation, collection, visualization, exploration, distribution, storage, transmission, and security. The move to big data sets is driven by the additional information derivable from the analysis of large sets of related data, which allows correlations to be found and turned into useful information and knowledge. This project will address the limitations posed by big data sets in one of several areas, including bioinformatics/genomics, multimedia, complex simulations, or environmental discovery. For more details please contact Phoebe.