
Showing posts with label P.

Wednesday, January 12, 2011

PERCEPTIVE COMPUTING: A computer with perceptual capabilities

The survival of animals depends heavily on well-developed sensory abilities. Likewise, human cognition depends on highly developed abilities to perceive, integrate, and interpret visual, auditory, and touch information. There is little doubt that if computers had even a small fraction of the perceptual ability of animals or humans, they would be far more powerful. Adding such perceptual abilities to computers would enable computers and humans to work together more as partners. The Perceptive Computing (Blue Eyes) project aims at creating computational devices with the sort of perceptual abilities that people take for granted. Blue Eyes uses non-obtrusive sensing technology.

Friday, July 30, 2010

Seminar on Photonic Band Gap Materials: Light Trapping Crystals

Photonic Band Gap (PBG) materials are artificial, periodic dielectrics that enable engineering of the most fundamental properties of electromagnetic waves. These include the laws of refraction, diffraction, and spontaneous emission of light. Unlike traditional semiconductors that rely on the propagation of electrons through an atomic lattice, PBG materials execute their novel functions through selective trapping or localization of light. This is a fundamentally new and largely unexplored property of Maxwell's equations. It is also of practical importance for all-optical communications, information processing, efficient lighting, and solar energy trapping. Three-dimensional (3D) PBG materials offer a unique opportunity to simultaneously (i) synthesize micron-scale 3D circuits of light that do not suffer from diffractive losses and (ii) engineer the electromagnetic vacuum density of states in this 3D optical micro-chip. This combined capability opens a new frontier in integrated optics as well as the basic science of radiation-matter interactions. I review recent approaches to micro-fabrication of photonic crystals with a large 3D PBG centered near 1.5 microns. These include direct laser-writing techniques, holographic lithography, and a newly invented optical phase mask lithography technique. I discuss consequences of PBG materials in classical and quantum electrodynamics.

Power transformers

Power transformers are the most significant pieces of equipment in electrical power delivery systems. One of the key parameters to be monitored in a power transformer is the internal temperature. High temperature accelerates the aging of winding paper insulation and increases the risk of bubbling under severe load conditions. Temperature is also an important parameter for the transformer cooling system. The transformer winding hottest-spot temperature is one of a number of limiting factors for the loading capability of transformers. One way to increase the loading capability is to increase the efficiency of the cooling system by using fans and pumps. This research focuses on the investigation of the effect of the cooling system parameters, in particular the oil flow rate, on the thermal performance of power transformers.
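As a point of reference, the winding hot-spot temperature is usually treated as a sum of three terms (this is the standard loading-guide style of decomposition; the notation below is illustrative and not taken from the research described above):

    \Theta_{HS} = \Theta_{A} + \Delta\Theta_{TO} + \Delta\Theta_{HS}

that is, the ambient temperature plus the top-oil temperature rise over ambient plus the hot-spot rise over top oil. A more effective cooling system, for example a higher oil flow rate, acts mainly on the two rise terms, which is why the cooling parameters studied here bear directly on loading capability.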

Sunday, August 30, 2009

Prolog

Logic programming is a programming paradigm based on mathematical logic. In this paradigm the programmer specifies relationships among data values (this constitutes a logic program) and then poses queries to the execution environment (usually an interactive interpreter) in order to see whether certain relationships hold. Put another way, a logic program, through explicit facts and rules, defines a base of knowledge from which implicit knowledge can be extracted. This style of programming is popular for database interfaces, expert systems, and mathematical theorem provers. In this tutorial you will be introduced to Prolog, the primary logic programming language, through the interactive SWI-Prolog system (interpreter).

You will notice that Prolog has some similarities to a functional programming language such as Hugs. A functional program consists of a sequence of function definitions; a logic program consists of a sequence of relation definitions. Both rely heavily on recursive definitions. The big difference is in the underlying execution "engine", that is, the imperative parts of the languages. The execution engine of a functional language evaluates an expression by converting it to an acyclic graph and then reducing the graph to a normal form which represents the computed value. The Prolog execution environment, on the other hand, doesn't so much "compute" an answer as "deduce" one from the relation definitions at hand. Rather than being given an expression to evaluate, the Prolog environment is given an expression which it interprets as a question:

For what parameter values does the expression evaluate to true?
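As a small illustration (the family relation below is an invented example, not part of the tutorial text), a Prolog knowledge base consists of facts and rules like these:

    % Facts: explicit knowledge about a made-up family.
    parent(tom, bob).
    parent(bob, ann).
    parent(bob, liz).

    % Rule: implicit knowledge derived from the facts.
    grandparent(X, Z) :- parent(X, Y), parent(Y, Z).

Loading this into SWI-Prolog and asking ?- grandparent(tom, Who). makes the interpreter deduce Who = ann and, on backtracking, Who = liz, i.e., exactly the parameter values for which the expression is true.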
You will see that Prolog is quite different from other programming languages you have studied. First, Prolog has no types. In fact, the basic logic programming environment has no literal values as such. Identifiers starting with lower-case letters denote data values (almost like values in an enumerated type), while all other identifiers denote variables. Though the basic elements of Prolog are typeless, most implementations have been enhanced to include character and integer values and operations. Also, Prolog has mechanisms built in for describing tuples and lists. You will find some similarity between these structures and those provided in Hugs.
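For instance (an illustrative interpreter session, not taken from the tutorial itself), bob is an atom because it starts with a lower-case letter, while X, Head, and Tail are variables, and lists are written in square brackets:

    ?- X = bob.
    X = bob.

    ?- [Head|Tail] = [1, 2, 3].
    Head = 1,
    Tail = [2, 3].

The [Head|Tail] pattern splits a list into its first element and the remainder, much like the corresponding list pattern in Hugs.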

Saturday, August 1, 2009

PLASMA PANEL DISPLAY

For the past 75 years, the vast majority of displays have been built around the same technology: the cathode ray tube (CRT). Recently, a new alternative has popped up on store shelves: the plasma flat panel display. These displays have wide screens, comparable to the largest CRT sets, but they are only about 6 inches (15 cm) thick. In a conventional CRT, the display lights up thousands of tiny dots (called pixels) with a high-energy beam of electrons, based on the information in the video signal. In most systems, there are three pixel colors -- red, green and blue -- which are evenly distributed on the screen. By combining these colors in different proportions, the display can produce the entire color spectrum.

The basic idea of a plasma display is to illuminate tiny colored fluorescent lights to form an image. Each pixel is made up of three fluorescent lights -- a red light, a green light and a blue light. Just like a CRT television, the plasma display varies the intensities of the different lights to produce a full range of colors. The central element in a fluorescent light is a plasma, a gas made up of free-flowing ions (electrically charged atoms) and electrons (negatively charged particles). Xenon and neon atoms, the atoms used in plasma screens, release light photons when they are excited. These photons are used to illuminate the pixels accordingly.

Friday, July 3, 2009

Protein Memories for Computers

ABSTRACT

The world's most advanced supercomputer doesn't require a single semiconductor chip. The human brain consists of organic molecules that combine to form a highly sophisticated network able to calculate, perceive, manipulate, self-repair, think and feel. Digital computers can certainly perform calculations much faster and more precisely than humans, but even simple organisms are superior to computers in the other five domains. Computer designers may never be able to make machines having all the faculties of the natural brain, but we can exploit some special properties of biological molecules, particularly proteins, to build computer components that are faster, smaller and more powerful than any electronic device.

Devices fabricated from biological molecules promise compact size and faster data storage. They lend themselves to use in parallel-processing computers, 3D memories and neural networks.

As the trend towards miniaturization continues, the cost of manufacturing a chip increases considerably. On the other hand, the use of biological molecules as the active components in computer circuitry may offer an alternative approach that is more economical.

If you are interested in this seminar topic, mail them to get the full report of the seminar topic.
Mail ID: contact4seminars@gmail.com

Thursday, July 2, 2009

Push Technology

Push technology reverses the Internet's content delivery model. Before push, content publishers had to rely upon the end user's own initiative to bring them to a web site or download content. With push technology the publisher can deliver content directly to the user's PC, thus substantially improving the likelihood that the user will view it. Push content can be extremely timely, and delivered fresh several times a day. Information keeps coming to the user whether he asked for it or not. The most common analogy for push technology is a TV channel; it keeps sending us stuff whether we care about it or not.

Push was created to alleviate two problems facing users of the net. The first problem is information overload. The volume and dynamic nature of content on the internet is an impediment to users, and has become an ease-of-use issue. Without push, using the internet can be tedious, time consuming, and less than dependable: users have to manually hunt down information, search out links, and monitor sites and information sources. Push applications and technology building blocks narrow that focus and add considerable ease of use. The second problem is that most end users are restricted to low-bandwidth internet connections, such as 33.6 kbps modems, which makes it difficult to receive multimedia content. Push technology provides a means to pre-deliver much larger packages of content.

Push technology enables the delivery of multimedia content on the internet through the use of local storage and transparent content downloads. Like a faithful delivery agent, push, often referred to as broadcasting, delivers content directly to the user transparently and automatically. It is one of the internet's most promising technologies.


Already a success, push is being used to pump data in the form of news, current affairs, sports and so on to many computers connected to the internet. Updating software is one of the fastest growing uses of push. It is a new and exciting way to manage software update and upgrade hassles. Using the internet today without the aid of a push application can be tedious, time consuming, and less than dependable. Computer programming is an inexact art, and there is a huge need to quickly and easily get bug fixes, software updates, and even whole new programs out to people. Users have to manually hunt down information, search out links, and monitor sites and information sources.

THE PUSH PROCESS

For the end user, the process of receiving push content is quite simple. First, an individual subscribes to a publisher's site or channel by providing content preferences. The subscriber also sets up a schedule specifying when information should be delivered. Based on the subscriber's schedule, the PC connects to the internet, and the client software notifies the publisher's server that the download can occur. The server collates the content pertaining to the subscriber's profile and downloads it to the subscriber's machine, after which the content is available for the subscriber's viewing.

WORKING

Interestingly enough, from a technical point of view, most push applications are pull and just appear to be 'push' to the user. In fact, a more accurate description of this process would be 'automated pull'.

The web currently requires the user to poll sites for new or updated information. This manual polling and downloading process is referred to as 'pull' technology. From a business point of view, this process provides little information about the user, and even less control over what information is acquired. It is the user who has to keep track of the location of the information sites, and the user who has to continuously search for informational changes, a very time-consuming process. The 'push' model alleviates much of this tedium.