Topic: Some Decidability Results for Duration Calculus under synchronous interpretation
Keywords: Specification/Verification; Real-time systems; Automatic Theorem Proving
Indian Institute of Technology, Guwahati - 781001, India
Dan Van Hung International Institute for Software Technology, Post Box 3058, Macau
Paritosh K Pandya Tata Institute of Fundamental Research, Bombay, India.
Abstract: Duration Calculus (DC for short) provides a formal notation for specifying properties of real-time systems and a calculus for formally proving such properties. Decidability is the underlying foundation of automated reasoning, but, except for some of its simple fragments, DC has been shown to be undecidable. DC takes the set of real numbers to represent time, and the main source of undecidability is the assumption that, in a real-time system, state changes can occur at any time point. An implementation of a specification, however, is ultimately executed on a computer, and its states change according to a system clock. Under such an assumption, it has been shown that the decidability results can be extended to cover relatively richer subsets of DC. In this report, we extend these decidability results to still richer subsets of DC. As a result, many real-time systems of importance can now be mechanically verified.
Presented by: Brajendra Panda Department of Computer Science University of North Dakota, P.O. Box 9015 Grand Forks, ND 58202, USA
Abstract: It is extremely difficult to build computer systems that share information over networks and still remain invulnerable to electronic attacks. Techniques exist to detect such attacks. This research is based on the assumption that the database system has been attacked electronically and the attacking transaction has been detected. A graph-based approach is used in this paper to determine the damage done by the attacker. Two algorithms are developed: one performs the damage assessment, while the second recovers the database to a consistent state.
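The core of such a damage-assessment pass can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the log format, function name, and transaction identifiers are all hypothetical. Damage spreads along read-write dependencies, so any transaction that reads a data item written by an already-damaged transaction becomes damaged itself.

```python
# Hypothetical sketch of graph-based damage assessment. The log format
# (tid, read_set, write_set) and all names are illustrative assumptions.

def assess_damage(log, attacker_tid):
    """Given a serial log of (tid, read_set, write_set) entries, return the
    set of transactions affected by the attacking transaction."""
    dirty_items = set()      # data items whose values are suspect
    affected = set()
    for tid, reads, writes in log:
        if tid == attacker_tid or reads & dirty_items:
            affected.add(tid)
            dirty_items |= writes   # everything it wrote is now suspect
    return affected

log = [
    ("T1", set(),      {"x"}),       # the attacking transaction writes x
    ("T2", {"x"},      {"y"}),       # reads dirty x -> damaged, spreads to y
    ("T3", {"z"},      {"w"}),       # untouched by the damage
    ("T4", {"y", "w"}, {"v"}),       # reads dirty y -> damaged
]
print(assess_damage(log, "T1"))     # {'T1', 'T2', 'T4'}
```

The recovery algorithm would then re-execute or compensate exactly the transactions in the returned set, leaving the unaffected ones (here T3) untouched.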
Presented by: P.N. Girija, K. Kalyana Chakravarthy and A. Bhanu Shankar Department of Computer/Information Sciences Artificial Intelligence Lab University of Hyderabad Hyderabad - 500046
Abstract: Surface representation and reconstruction are important problems in a variety of disciplines, including geographic data processing, computer vision, computer graphics and computer-aided design. The Delaunay triangulation, in both two and three dimensions, has been used by different authors as the basis for constructing object-centered surface descriptions. Triangle-based representations are invariant under rigid transformations, they adapt to the variable density of the data distribution, and they can easily be updated. In particular, among all possible triangulations the Delaunay triangulation is considered the most appropriate for surface approximation because of the near-equilateral shape of its triangles.
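The defining property can be made concrete with a brute-force sketch: a triangle belongs to the 2-D Delaunay triangulation exactly when no other input point lies inside its circumcircle. The code below is an O(n^4) illustration of that criterion only (function names are ours); practical systems use incremental or divide-and-conquer algorithms.

```python
# Brute-force 2-D Delaunay triangulation via the empty-circumcircle test.
# For illustration only; real implementations are far more efficient.
from itertools import combinations

def ccw(a, b, c):
    """Twice the signed area of triangle abc; > 0 when counter-clockwise."""
    return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])

def in_circumcircle(a, b, c, p):
    """True if p lies strictly inside the circumcircle of CCW triangle abc."""
    ax, ay = a[0]-p[0], a[1]-p[1]
    bx, by = b[0]-p[0], b[1]-p[1]
    cx, cy = c[0]-p[0], c[1]-p[1]
    det = ((ax*ax + ay*ay) * (bx*cy - cx*by)
         - (bx*bx + by*by) * (ax*cy - cx*ay)
         + (cx*cx + cy*cy) * (ax*by - bx*ay))
    return det > 1e-12

def delaunay(points):
    triangles = []
    for a, b, c in combinations(points, 3):
        if ccw(a, b, c) < 0:           # orient counter-clockwise
            a, c = c, a
        if ccw(a, b, c) == 0:
            continue                    # degenerate (collinear) triple
        if not any(in_circumcircle(a, b, c, p)
                   for p in points if p not in (a, b, c)):
            triangles.append((a, b, c))
    return triangles

pts = [(0, 0), (2, 0), (1, 2), (1, 0.5)]
print(len(delaunay(pts)))   # 3: the outer triangle is split by the inner point
```

Note how the interior point (1, 0.5) invalidates the large outer triangle, whose circumcircle contains it, and the triangulation adapts to the local point density, as the abstract observes.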
Presented by: K. Narayana Murthy Department of CIS University of Hyderabad, Hyderabad, 500046 INDIA Phone: 3010512 Extn. 4056, FAX: 91-040-3010145 email: knmcs@uohyd.ernet.in
Abstract: This paper is about efficient retrieval from large dictionaries. Dictionaries are often too large to fit into the main memory of a computer, so good indexing schemes are required to make dictionary access efficient. In this paper we present a specific indexing technique for efficient retrieval from large dictionaries. Our scheme includes a non-dense TRIE index stored in main memory and a dense index file stored in secondary memory. To look up any word in the dictionary, only a few character comparisons and a few word comparisons are required, and upper bounds can be specified for both kinds of comparison. We can also easily fine-tune this indexing scheme to obtain good performance for any given dictionary and machine configuration.
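The two-level idea can be sketched as follows. This is a simplified illustration (all names are ours, and a sorted separator array stands in for the paper's character TRIE): a sparse in-memory index holds only the first word of each "disk" block, so locating the right block takes a few comparisons, and a search inside the small dense block takes only a few more.

```python
# Sketch of a two-level dictionary index: a non-dense in-memory index over
# blocks, plus a dense sorted index within each block. A sorted separator
# array stands in here for the paper's TRIE; the structure of the lookup
# (sparse step, then dense step) is the same.
import bisect

def build_index(words, block_size=4):
    words = sorted(words)
    blocks = [words[i:i+block_size] for i in range(0, len(words), block_size)]
    separators = [b[0] for b in blocks]   # non-dense: one key per block
    return separators, blocks

def lookup(word, separators, blocks):
    i = max(bisect.bisect_right(separators, word) - 1, 0)  # pick the block
    j = bisect.bisect_left(blocks[i], word)                # search inside it
    return j < len(blocks[i]) and blocks[i][j] == word

seps, blks = build_index(["ant", "bee", "cat", "dog", "emu", "fox", "gnu"])
print(lookup("dog", seps, blks), lookup("cow", seps, blks))  # True False
```

Both lookup steps are logarithmic in small quantities (number of blocks, block size), which is what makes the claimed upper bounds on comparisons possible; tuning `block_size` to the machine's page size is the kind of fine-tuning the abstract mentions.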
Presented by: Anustup Nayak, School of Public Policy, Georgia Institute of Technology, Atlanta, GA 30332, U.S.A
Abstract: As corporate and public-sector entities accept information technology as a production factor in their business processes, the volume of organizational knowledge captured in electronic databases has grown exponentially. Data mining is a suite of tools and techniques that seek to discover hidden knowledge, interesting patterns and new business rules in repositories of electronic data. Currently regarded as the key element of a much more elaborate process of Knowledge Discovery in Databases (KDD), data mining integrates theoretical perspectives from statistics, machine learning and artificial intelligence. From the standpoint of technology implementation it relies on advances in data modeling, data warehousing and information retrieval. However, the most important challenges lie in organizing business practices around the knowledge discovery activity. As India gears towards an informatized economy, there are huge gains to be made from realizing the potential of this new technology. With the growth of consumer-related information for market research, online databases for financial decision support, public-domain databases on meteorology and satellite images, and possibilities for creating online libraries, there is enormous scope for applications of data mining techniques and research in the country. The tutorial talk will focus on the following aspects of the technology: What is data mining? The process of knowledge discovery in databases; algorithmic approaches to data mining; industrial applications of data mining; a brief overview of text data mining (the speaker's experience); implementing data mining in a corporate context; and how Indian business and government agencies can benefit.
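As one concrete taste of the "algorithmic approaches" the tutorial covers, the sketch below shows a tiny Apriori-style frequent-itemset pass, a classic data mining building block for discovering business rules (the example data and function name are illustrative, not from the talk).

```python
# Minimal Apriori-style frequent-itemset mining: an itemset is "frequent"
# if it appears in at least min_support transactions, and a k-itemset can
# only be frequent if all of its (k-1)-subsets are frequent (the pruning
# rule that makes Apriori practical).
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Return {itemset: count} for all itemsets meeting min_support."""
    items = sorted({i for t in transactions for i in t})
    result, k = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        result.update(frequent)
        k += 1
        keys = list(frequent)
        cand = {a | b for a, b in combinations(keys, 2) if len(a | b) == k}
        candidates = [c for c in cand
                      if all(frozenset(s) in frequent for s in combinations(c, k - 1))]
    return result

txns = [{"bread", "milk"}, {"bread", "butter"}, {"bread", "milk", "butter"}]
fs = frequent_itemsets(txns, min_support=2)
print(fs[frozenset({"bread", "milk"})])   # 2
```

From such itemsets, association rules ("customers who buy bread also buy milk") are read off with their support and confidence, which is the kind of hidden business rule the abstract describes.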
Presented by: Gyanendra Ku. Patra HCL India
Abstract: In present-day IT, many vendors supply machines of different architectures, running different types of OS and software. The Internet community is trying to bring these together in a single forum with a single network. The traffic on the communication channel is increasing, which is leading to infrastructural problems. When infrastructure comes into the picture, everything lies either at the user end or in the communication bandwidth; by 'at the user end' I mean the type of hardware users have or the software they operate. At this juncture of diversity in network protocols as well as machine architectures, Java comes into the picture, offering the net community a challenging solution for high accessibility as well as reliability. The most rewarding aspect of Java is that it unites the different architectures and protocols into a single one. In this tutorial we present a comparative study of Java and the rest of the Internet tools, and Java's importance to the community.
Presented by: Jay N. Bhuyan Computer Science Department Tuskegee University Tuskegee, AL 36088, U.S.A. j.bhuyan@computer.org Venkat N. Gudivada Dow Jones Markets Harborside Financial Centre Jersey City, NJ 07311, U.S.A. firstname.lastname@example.org
Abstract: This paper discusses the design and implementation of a complete information retrieval system in which feedback from a number of users about the system's performance (global feedback) is stored in the form of clusters called user-oriented clusters. These clusters are utilized in answering a query in conjunction with feedback obtained about the system's performance after a partial retrieval for the query (local feedback). The global feedback is also utilized for balancing the load when the clusters of documents are distributed among different processors in a distributed environment. The first part of the paper deals with the use of user-oriented clusters without consideration of any distributed environment. Clusters are constructed taking into account the users' perception of similarity between documents. The system accumulates feedback from the users into three undirected graphs and uses it to construct user-oriented clusters. An optimization function to improve the effectiveness of the clustering process is developed and optimized through genetic algorithms. In order to determine an appropriate description of clusters, we discard some redundant and inappropriate terms present in the document collection; this problem, called the term refinement problem, is formulated and solved. Clusters and queries are represented as vectors of the selected terms. The system retrieves clusters in decreasing order of their cosine similarities to a user's query until the user is satisfied. The system developed is experimentally validated and shows an average improvement in the range of 87-237% over an existing system based on the vector space model. The second part of this paper deals with some ongoing research on the distribution of documents in a distributed environment. We assume that a term present in both a query and a cluster results in a unit of computation that determines the cosine similarity between the two.
It is also assumed that the frequency of use of a term by past queries is directly proportional to the frequency of its use by future queries. The problem is to distribute the clusters among different processors such that each processor does the same amount of computation. As this problem is NP-hard, an efficient optimal algorithm is unlikely to exist. We propose two heuristic methods to solve this problem: one based on a greedy technique and the other on genetic algorithms. We also discuss ongoing experiments with this system on a cluster of workstations.
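The greedy heuristic can be sketched as follows (the data and names are illustrative, not the paper's): each cluster's expected load is the sum of the past-query frequencies of its terms, and clusters are assigned, heaviest first, to the currently least-loaded processor.

```python
# Greedy load-balancing sketch: cluster load = expected computation, i.e.
# the sum of past-query frequencies of the terms the cluster contains.
# A min-heap keeps the least-loaded processor on top.
import heapq

def distribute(clusters, term_freq, n_procs):
    loads = {c: sum(term_freq.get(t, 0) for t in terms)
             for c, terms in clusters.items()}
    heap = [(0, p, []) for p in range(n_procs)]   # (load, proc, clusters)
    heapq.heapify(heap)
    for c in sorted(loads, key=loads.get, reverse=True):  # heaviest first
        load, p, assigned = heapq.heappop(heap)           # least-loaded proc
        assigned.append(c)
        heapq.heappush(heap, (load + loads[c], p, assigned))
    return {p: (load, assigned) for load, p, assigned in heap}

clusters = {"C1": {"java", "net"}, "C2": {"sql"},
            "C3": {"java"}, "C4": {"net", "sql"}}
term_freq = {"java": 5, "net": 3, "sql": 2}
print(distribute(clusters, term_freq, 2))   # both processors end with load 10
```

Greedy assignment gives no optimality guarantee for this NP-hard problem, which is why the paper also explores a genetic-algorithm alternative, but it is fast and often close to balanced in practice.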
Presented by: Parthasarathi Roop University of New South Wales
Abstract: We propose a new multi-level temporal logic, HPTL (Hidden Propositional Temporal Logic), and compare it with existing schemes. HPTL supports hidden propositions, which help in expressing quantitative temporal constraints in a propositional framework. It also supports module-name qualifiers of the form [in c]p, helpful in expressing local safety and liveness properties of composite systems. We propose a semantics of HPTL in terms of composed sequence functions and a scheme for the propagation of local properties to the global level.
Presented by: S. Yerneni, G.R. Mohapatra , T. Alwast Victoria University of Technology Department of Computer and Mathematical Sciences PO Box:14428, MC, Melbourne Victoria 8001, Australia
Abstract: Nowadays, almost everyone has access to the Internet, and surfing the net for the latest news in all areas has become part of daily life. Surfing the net is fun and informative, but it takes a lot of time to retrieve web pages and look for the required information when the traffic is very high. Today's networks are composed of many interconnected heterogeneous resources. Users specify personal, rule-based, intelligent agents that control the retrieval and handling done on their behalf. It is essential to use tools able to master this great amount of complexity and to manage network resources in a single and consistent way. Many intelligent agents and agent prototypes have been introduced in recent years, but each of these agents enters a different application niche and promotes a particular decision technology. If this trend continues, in the near future we are likely to see an explosion of agent confusion, created by masses of non-standard and non-communicating agents absorbing network resources. The ABE Developer's Toolkit is designed to ease the agent-enabling of applications. The Agent Building Environment (ABE) architecture describes an agent structure which allows agents to be created with different types of intelligence engines and integrated into existing applications. This ABE architecture supports the Agent Design Model (ADM). This paper involves the development of a mobile intelligent agent based on the ABE architecture. The agent sits on the client computer and acts on the remote server, which in our case is the Internet; for this reason it does not consume many network resources. The main task of this agent is to retrieve the specified web page, read all the course information of the computer science department, and email the user only the important information. This saves the user time and effort: the user just has to start the agent program and will receive all the information whenever it is wanted.
Because of this, the agent does not wander on the Internet, thereby decreasing network traffic. The agent is provided with a user interface in which the user can select options, such as loading or unloading the fact set dynamically while the agent is running.
Presented by: Manoranjan Baral, SAP America, USA
Abstract: The Internet world has changed the conventional way of doing business, and Electronic Commerce has become a buzzword these days. In the business sense, it refers to electronically handled business transactions between customers/companies/resellers and companies. Business objects form the backbone of Electronic Commerce. From a technical perspective, Electronic Commerce refers to business over the Internet (web browsers, web servers, transaction servers, application servers, database servers, workflow, and EDI). Electronic Commerce creates the conditions for short response times to customer requests and saves money through fully automated business operations. Typical examples: * handling of bank transactions (electronic banking) * software purchase and distribution (electronic software delivery) * processing of work orders and purchase orders (electronic ordering). Note: The above material is my personal view, not an official SAP document.
Presented by: Ms. Suchitra Pattnaik TEKON Services 29/30 Rasulgarh Ind. Est. Bhubaneswar-10 Ph: 91-674-581885, Ph/Fax: 91-674-580529, email: email@example.com
Abstract: Today, Geographic Information Systems (GIS) form a multi-billion-dollar industry employing hundreds of thousands of people worldwide. GIS is being taught in schools, colleges and universities throughout the world, though it is still in its infancy in India. There are definite advantages to thinking and working geographically, and that is the reason why GIS is increasingly becoming a predominant technology for decision making and problem solving. GIS is a computer-based tool for mapping and analyzing things that exist and events that occur on Earth. A GIS stores information about the world as a collection of thematic layers that can be linked together by geography. This is a relatively simple concept with a broad range of applications to many real-world problems, from flood damage estimation to traffic volume monitoring to the planning of train networks. GIS technology integrates common database operations, such as query and statistical analysis, with the unique visualisation and geographic analysis benefits offered by maps. This is how GIS differs from other information systems and what makes it an important tool for explaining events, predicting outcomes and planning strategies. The article focuses on the GIS methodology, the GIS building blocks, GIS technologies and tasks, and its applications in transportation, mapping and land use planning.
Presented by: G. Panda, S.K. Meher, K.C. Mohapatra REC Rourkela
Abstract: Image compression is an important issue in many applications, including biomedical imaging, geophysics, communication and astronomy. This paper presents a novel image compression technique using the Discrete Wavelet Transform. The proposed algorithm has been applied to two different digital images, and it is observed that a substantial compression of the images is possible. From the compressed data it is possible to reconstruct the original images faithfully by using the Inverse Discrete Wavelet Transform.
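The principle behind wavelet compression can be shown with a toy sketch (this is an illustration of the general idea, not the paper's algorithm): one level of a Haar discrete wavelet transform separates a signal into averages and details, small detail coefficients are discarded, and the inverse transform reconstructs a close approximation.

```python
# Toy one-level Haar DWT on a 1-D signal: keep the averages, threshold
# away small detail coefficients, reconstruct with the inverse transform.
# Real image codecs do this in 2-D over several decomposition levels.

def haar_forward(x):
    """One-level Haar DWT of an even-length signal -> (averages, details)."""
    avg = [(x[i] + x[i+1]) / 2 for i in range(0, len(x), 2)]
    det = [(x[i] - x[i+1]) / 2 for i in range(0, len(x), 2)]
    return avg, det

def haar_inverse(avg, det):
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

signal = [10, 12, 8, 8, 50, 20, 7, 9]
avg, det = haar_forward(signal)                      # det = [-1.0, 0.0, 15.0, -1.0]
compressed = [d if abs(d) > 2 else 0 for d in det]   # keep only the big detail
restored = haar_inverse(avg, compressed)
print(restored)   # [11.0, 11.0, 8.0, 8.0, 50.0, 20.0, 8.0, 8.0]
```

Most detail coefficients of natural images are near zero, so storing only the averages and the few large details yields substantial compression while the reconstruction stays visually faithful, which is the effect the abstract reports.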
Presented by: A. Patnaik, S.K. Dash, R.K. Mishra Department of Electronic Science Berhampur University, Berhampur-760007
Abstract: Indexing terms: Circular microstrip antenna, Neural networks, Resonant frequency. An artificial neural network architecture is developed for the design of circular microstrip antennas. The network takes the thickness of the substrate (h in mm), the dielectric constant (Er) and the resonant frequency in the dominant TM11 mode (fr in GHz) as its inputs and gives the corresponding radius (a in mm) of the circular patch. A fast learning algorithm has been used for training.
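The mapping such a network learns can be sanity-checked against the simple cavity-model formula for the TM11 mode, fr = 1.8412*c / (2*pi*a*sqrt(Er)), inverted for the radius a (fringing fields ignored here; capturing the substrate-thickness corrections that this formula omits is precisely what makes the trained network worthwhile). The constant 1.8412 is the first zero of the derivative of the Bessel function J1.

```python
# Cavity-model design formula for a circular microstrip patch (TM11 mode),
# used as a rough check on the neural network's output. Fringing-field
# corrections involving the substrate thickness h are deliberately omitted.
import math

C = 3e11  # speed of light in mm/s

def patch_radius(fr_ghz, er):
    """Radius a (mm) of a circular patch resonant at fr (GHz), TM11 mode."""
    fr = fr_ghz * 1e9
    return 1.8412 * C / (2 * math.pi * fr * math.sqrt(er))

print(round(patch_radius(3.0, 2.32), 2))   # 19.24 mm
```

A network trained on measured or full-wave-simulated data should produce radii close to this value but corrected for the substrate thickness h, which the closed-form expression ignores.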
Presented by: Nerraj Kapoor, Sambit Samal & Vedvyas Infosys Technologies Limited, Bhubaneswar
Abstract: The growing complexity of information systems and the ensuing problems of their development and management have highlighted the inadequacy of formal and informal methods for constructing highly reliable systems. These problems manifest themselves in the computer system. Users have often demanded reliable computer systems because they realise that most failures are due to poor specification and design. The Software Engineering Matrix Analysis Methodology stated here aims to demonstrate an approach to software development that will not only lead to the creation of good information systems but will also help to keep track of the various steps involved in the development cycle. Strict adherence to it will also help to achieve the creation of software which has standardisation built into it.
Presented by: S.K. Sahu, P.K. Tripathy Orissa Computer Academy Bhubaneswar and P.G. Department of Statistics, Utkal University.
Abstract: A mathematical model has been derived for obtaining the Economic Order Quantity of an item for which the supplier permits a conditional credit facility in settling the amount owed to him and there are intermediate payments from the customer. In the proposed model, the demand rate is assumed not only to be stock dependent but also to follow a specified relationship. To illustrate the effectiveness of the proposed model, numerical results have been obtained using the Newton-Raphson method.
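The paper's cost function is not reproduced in the abstract; as a stand-in, the sketch below applies the Newton-Raphson iteration to the classical EOQ cost TC(Q) = D*K/Q + h*Q/2, whose optimum is the textbook sqrt(2*D*K/h). The root-finding machinery is what carries over to the paper's more elaborate model.

```python
# Newton-Raphson applied to the first-order condition dTC/dQ = 0 of the
# classical EOQ model. The parameter values are purely illustrative.

def newton_raphson(f, fprime, x0, tol=1e-9, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

D, K, h = 1000.0, 50.0, 2.0           # demand rate, ordering cost, holding cost
f = lambda q: -D * K / q**2 + h / 2   # dTC/dQ, zero at the optimum
fp = lambda q: 2 * D * K / q**3       # d2TC/dQ2
q_star = newton_raphson(f, fp, x0=100.0)
print(round(q_star, 2))               # 223.61 = sqrt(2*1000*50/2)
```

For a cost function with credit terms and stock-dependent demand there is no closed form, so exactly this kind of iteration on the first-order condition delivers the numerical results the abstract refers to.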
Presented by: S.K. Udgata & Smita Mahapatra Department of Computer Science, Berhampur University, Berhampur, Orissa -760007
Abstract: Artificial neural networks, being excellent detectors for pattern recognition problems, have become a major topic of study in recent years. Speech recognition, treated as a pattern recognition problem, is now one of the significant research areas under the broad domain of artificial neural networks. Indeed, it is a major domain of computer science, apart from the other traditional disciplines that study spoken words. In this paper, an attempt has been made to recognize speaker-independent speech using an important paradigm of artificial neural networks called Adaptive Resonance Theory, which is intended to be consistent with the human brain. Speech recognition is a much-talked-about topic in recent times; the related works are often viewed with either unrealistic expectations or unwarranted scepticism, and prudent appraisals are rare. This paper attempts to examine several aspects of speech recognition by applying different schemes of Adaptive Resonance Theory, viz. ART-2A and S-ART, based on the general ART network principle. Speaker-independent speech recognition is an example of pattern recognition in which words from a given vocabulary have to be recognised irrespective of the speaker, his/her mood, pitch, etc. The speech signal is recorded in 16-bit signed linear PCM format at 32 KHz. Background noise is eliminated by detecting the end points in the speech signal using an end-point location algorithm based on the standard deviation method. The speech signal comprises different types of information, such as syntactic structure, meaning, and the speaker's sex and mood. For the speech recognition problem the focus is on obtaining the information relevant to the recognition of spoken words and eliminating all irrelevant information. The relevant information is extracted as Mel Frequency Cepstrum Coefficients (MFCCs) by applying the Fast Fourier Transform to a Hamming window moved in steps of 16 ms; eight MFCCs are evaluated for each window of speech.
As the neural network is taken to be of fixed size, the number of windows for all the words needs to be constant, for which the MFCCs are time-normalised to obtain 20 units of 8 MFCCs. Since the recognition needs to be independent of pitch, the MFCCs are also amplitude-normalised to remove continuity constraints, and then the MFCCs are smoothed. The preprocessed speech signal is then presented to the ART network for training. The training is done by taking a number of speech samples from a given vocabulary spoken by different speakers. After successful training, the network is able to recognize words from the vocabulary spoken by any speaker, irrespective of whether the speaker was involved in the training or not. The performance with regard to noise level, speed and accuracy is compared, and it has been found that the S-ART network model is the better alternative. Key Words: Speech Recognition, Artificial Neural Network, Adaptive Resonance Theory, Unsupervised Learning.
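The standard-deviation end-point detection mentioned above can be sketched as follows (the frame length, threshold factor, and noise-floor estimate are illustrative assumptions, not the paper's parameters): frames whose standard deviation stays near the background level are treated as silence, and the utterance is clipped to the active region.

```python
# Sketch of a standard-deviation end-point detector: per-frame standard
# deviation is compared against a multiple of the estimated noise floor,
# and the speech region is the span of frames exceeding that threshold.
import statistics

def endpoints(samples, frame_len=4, factor=3.0):
    frames = [samples[i:i+frame_len] for i in range(0, len(samples), frame_len)]
    devs = [statistics.pstdev(f) for f in frames]
    noise_floor = min(devs) + 1e-9         # quietest frame ~ background noise
    active = [i for i, d in enumerate(devs) if d > factor * noise_floor]
    if not active:
        return None
    return active[0] * frame_len, (active[-1] + 1) * frame_len

silence = [0, 1, 0, -1]
speech = [30, -25, 40, -35]
signal = silence + silence + speech + speech + silence
print(endpoints(signal))   # (8, 16): the speech region, trimmed of silence
```

Everything outside the returned sample range is discarded before the MFCC extraction, so the features describe only the spoken word and not the surrounding background noise.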
Presented by: S.K. Udgata & Smita Mahapatra Department of Computer Science, Berhampur University, Berhampur, Orissa 760007
Abstract: Identifying persons through their fingerprints is an age-old and well-established technique, and fingerprint recognition problems have been a focus of researchers for the past several years. Many algorithms have been proposed and implemented for successful recognition of fingerprints. Artificial neural networks, having excellent feature detection characteristics, are considered to be one of the important tools for pattern recognition problems, and fingerprint recognition can also be considered within the broad domain of pattern recognition. Adaptive Resonance Theory (ART) network models are recent, effective and powerful tools for the classification of patterns in an unsupervised mode; the ART network model also solves the stability-plasticity dilemma to a great extent. ART-1, ART-2, ART-2A and S-ART are the various ART network models, proposed in that order for various classification problems, each with its own advantages and limitations. In this paper, an attempt has been made to apply ART network models to fingerprint identification in the presence of inherent variations, viz. rotation, scaling, translation, noise, etc. Each fingerprint is mapped onto a 512 x 512 binary matrix in which a ridge is represented by '1' and everything else by '0'. The input fingerprint matrices with rotational, scaled and translational features are preprocessed to obtain the moment invariant features before being presented to the S-ART network, while the raw binary matrix is presented to the ART-1 network for training. The performance of the ART-1 and S-ART network models with respect to fingerprint recognition under rotation, scaling, translation and noise is compared. It has been found that the performance of the S-ART model with respect to noise level, speed and accuracy is better than that of the ART-1 model for distorted fingerprints.
Key Words: Fingerprint, Artificial Neural Network, Adaptive Resonance Theory, Unsupervised Learning, Moment Invariant Features.
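The moment-invariant preprocessing can be illustrated with a toy sketch (this shows only the first Hu moment on tiny images; the paper's exact feature set and the 512 x 512 scale are not reproduced): the feature is unchanged by translation and scaling of a binary image, which is what lets shifted or resized prints map to the same feature vector.

```python
# First Hu moment invariant of a binary image: phi1 = eta20 + eta02,
# where the eta are central moments normalised by powers of the area.
# Central moments remove translation; the normalisation removes scale.

def first_hu_moment(img):
    """img: list of rows of 0/1 pixels. Returns phi1 = eta20 + eta02."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            m00 += v
            m10 += x * v
            m01 += y * v
    cx, cy = m10 / m00, m01 / m00               # centroid
    mu20 = mu02 = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            mu20 += (x - cx) ** 2 * v           # central moments
            mu02 += (y - cy) ** 2 * v
    eta20, eta02 = mu20 / m00**2, mu02 / m00**2  # scale-normalised
    return eta20 + eta02

square = [[1, 1], [1, 1]]
shifted = [[0, 0, 0], [0, 1, 1], [0, 1, 1]]      # same square, translated
print(first_hu_moment(square) == first_hu_moment(shifted))   # True
```

The full Hu set adds invariants built from the third-order moments that are also rotation-invariant; computing such a handful of numbers from each 512 x 512 matrix is what allows the S-ART network to classify rotated, scaled or shifted prints consistently.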
Presented by: Rakesh Agrawal Infosys Technologies Limited, Near Planetarium, Bhubaneswar email: firstname.lastname@example.org, Ph: 583068-71, Fax: 583991
The growing complexity of information systems and the ensuing problems of their development, maintenance and management have highlighted the inadequacy of formal and informal methods for constructing such systems. These problems manifest themselves in computer systems which are often unmanageable, unreliable, inflexible and hence difficult to maintain. Users have often demanded reliable computer systems because they realize that most failures are due to poor specification and design. This has resulted in the emergence of a number of information systems methodologies, together with associated computerized development environments, of which the Object-Oriented (OO) approach is one of the most recent.
OO is often used to promote software development and reuse. Languages like Smalltalk reduce not only development time but also the cost of maintenance, simplifying the creation of new systems and the reuse of old ones. Nevertheless, OO is not a panacea: effort has to be put in for its proper use. Thus we consider OO as a paradigm which provides a new image, a new way of conceptualizing the development life cycle. With the help of paradigms, software developers and users are supported in apprehending the development life cycle and in organizing its aspects into a comprehensive method.
PATHOS (A Paradigmatic Approach To High-level Object-Oriented Software development) aims to demonstrate an approach to information system development that will lead not only to the creation of good information systems, but also to an explicit representation of the maintained business knowledge, so as to allow for its more effective and active exploitation at run time.