This post presents a list of terms commonly used with graphical user interfaces
(GUIs). GUIs are systems that allow the creation and manipulation of user interfaces
employing windows, menus, icons, dialog boxes, and the mouse and keyboard. The Macintosh
Toolbox, Microsoft Windows and X-Windows are some examples of GUIs.
Pointing Devices:
Pointing devices allow users to point at different parts of the screen. They can be
used to invoke a command from a list of commands presented in a menu, and to
manipulate objects on the screen by:
* Selecting objects on the screen
* Moving objects around the screen, or
* Merging several objects into another object.
Since the 1960s, a diverse set of tools has been used as pointing devices, including the
light pen, the joystick, the touch-sensitive screen and the mouse. The popularity of the mouse is
due to the natural hand coordination it allows and the ease of tracking the cursor on the screen.
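The sketch below illustrates the selection and moving of an on-screen object with the mouse. It is a minimal example assuming the tkinter toolkit; the post itself names no particular toolkit, so the widget and event names here are illustrative.

```python
import tkinter as tk

# Minimal sketch: selecting and dragging an object with the mouse.
root = tk.Tk()
canvas = tk.Canvas(root, width=300, height=200)
canvas.pack()

# An on-screen object the pointer can manipulate.
box = canvas.create_rectangle(20, 20, 80, 80, fill="steelblue")
drag = {"x": 0, "y": 0}

def select(event):
    # Selecting: remember where the drag started.
    drag["x"], drag["y"] = event.x, event.y

def move(event):
    # Moving: shift the object by the pointer's displacement.
    canvas.move(box, event.x - drag["x"], event.y - drag["y"])
    drag["x"], drag["y"] = event.x, event.y

canvas.tag_bind(box, "<Button-1>", select)
canvas.tag_bind(box, "<B1-Motion>", move)
root.mainloop()
```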
Pointer:
A symbol that appears on the display screen and that you move to select objects and
commands. Usually the pointer appears as a small angled arrow.
Bit-Mapped Displays:
As memory chips get denser and cheaper, bit-mapped displays are replacing character-based
display screens. A bit-mapped display is made up of tiny dots (pixels) that are independently
addressable, giving it much finer resolution than a character display. Bit-mapped displays
have several advantages over character displays. One of the major advantages is the ability to
manipulate vector and raster graphics and to present information on the screen exactly as it
will appear in its final form on paper (also called WYSIWYG: What You See Is What You Get).
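The fact that each pixel is independently addressable can be illustrated with a small sketch. The use of tkinter's PhotoImage as the in-memory bitmap is an assumption made for illustration only.

```python
import tkinter as tk

# Minimal sketch of independently addressable pixels in a bitmap.
root = tk.Tk()
bitmap = tk.PhotoImage(width=100, height=100)

# Address individual pixels directly: draw a diagonal line dot by dot.
for i in range(100):
    bitmap.put("#000000", (i, i))

tk.Label(root, image=bitmap).pack()
root.mainloop()
```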
Windows:
When a screen is split into several independent regions, each one is called a window.
Several applications can display results simultaneously in different windows. The end-user
can switch from one application to another or share data between applications. Windowing
systems can display windows either tiled or overlapped, and users can organize the screen
by resizing windows or moving related windows closer together.
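A minimal sketch of two overlapping, user-resizable windows is shown below, again assuming tkinter as the windowing toolkit; the geometry values are illustrative.

```python
import tkinter as tk

# Minimal sketch of overlapping, resizable windows.
root = tk.Tk()
root.title("Window 1")
root.geometry("300x200+100+100")   # width x height + x-offset + y-offset

# A second, independent window that overlaps the first.
second = tk.Toplevel(root)
second.title("Window 2")
second.geometry("300x200+250+180")

# Both windows can be resized or dragged by the user to reorganize the screen.
root.mainloop()
```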
Menus:
A menu displays a list of commands available within an application. From
this menu, the end-user can select operations such as File, Edit or Search. Each menu
item can be either a word or an icon representing a command or a function. A menu
item is invoked by moving the cursor onto it and clicking the mouse. Instead of
memorizing the commands available at each stage, the user simply selects a command
from a menu bar displaying the list of available commands.
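A minimal sketch of a menu bar with pull-down menus follows, assuming tkinter; the menu labels and commands are illustrative.

```python
import tkinter as tk

# Minimal sketch of a menu bar with File/Edit/Search menus.
root = tk.Tk()
menubar = tk.Menu(root)

file_menu = tk.Menu(menubar, tearoff=0)
file_menu.add_command(label="Open", command=lambda: print("Open selected"))
file_menu.add_command(label="Save", command=lambda: print("Save selected"))
file_menu.add_separator()
file_menu.add_command(label="Quit", command=root.destroy)

# Each entry on the menu bar pulls down its own list of commands.
menubar.add_cascade(label="File", menu=file_menu)
menubar.add_cascade(label="Edit", menu=tk.Menu(menubar, tearoff=0))
menubar.add_cascade(label="Search", menu=tk.Menu(menubar, tearoff=0))

root.config(menu=menubar)
root.mainloop()
```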
Dialog boxes:
Dialog boxes allow more complex interaction between the user and the
computer. Dialog boxes employ a collection of control objects such as dials, buttons,
scroll bars and editable boxes.
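A minimal sketch of such a dialog box, built from a few control objects, is given below. It assumes tkinter and uses the print-options scenario mentioned later in this section; the widget layout is illustrative.

```python
import tkinter as tk

# Minimal sketch of a dialog box composed of control objects
# (labels, a spin control, an editable field and buttons).
root = tk.Tk()

def show_print_dialog():
    dialog = tk.Toplevel(root)
    dialog.title("Print")

    tk.Label(dialog, text="Copies:").grid(row=0, column=0, padx=5, pady=5)
    copies = tk.Spinbox(dialog, from_=1, to=99, width=5)
    copies.grid(row=0, column=1, padx=5, pady=5)

    tk.Label(dialog, text="Pages:").grid(row=1, column=0, padx=5, pady=5)
    pages = tk.Entry(dialog, width=10)      # e.g. "1-5"
    pages.grid(row=1, column=1, padx=5, pady=5)

    def ok():
        print("Print", copies.get(), "copies of pages", pages.get())
        dialog.destroy()

    tk.Button(dialog, text="OK", command=ok).grid(row=2, column=0, pady=5)
    tk.Button(dialog, text="Cancel", command=dialog.destroy).grid(row=2, column=1, pady=5)

tk.Button(root, text="Print...", command=show_print_dialog).pack(padx=20, pady=20)
root.mainloop()
```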
Dialog boxes are primarily used to collect information from the user or to present
information to the user. In a print dialog, for example, the information obtained includes
the number of copies and the page numbers to be printed. Dialog boxes are also used to
display error messages in the form of alert boxes. Dialog boxes use a wide range of screen
control elements to communicate with the user.
In graphical user interfaces, text is not the only form of interaction. Icons represent
concepts such as file folders, wastebaskets and printers, and symbolize words and concepts
commonly applied in different situations. In a painting application, for example, each tool
icon represents a certain type of painting behavior: once the pencil icon is clicked, the
cursor behaves as a pencil and draws lines. Applications of icons to user-interface design
are still being explored in new computer systems and software, such as the NeXT
computer user interface.
Icons:
Icons are used to provide a symbolic representation of any system- or user-defined object,
such as a file, folder, address book or application. Each type of object is represented
by a specific type of icon. In some GUIs, for example, directories are represented by a
folder icon. A folder contains a group of files or other folders. Double-clicking on a
folder icon causes a window to be opened displaying the icons and folder icons that
represent the folder's contents.
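The sketch below illustrates this double-click behaviour, assuming tkinter and using a plain list of folder names in place of real folder icons; the simplification is purely illustrative.

```python
import os
import tkinter as tk

# Minimal sketch: double-click a folder to open a window showing its contents.
root = tk.Tk()
root.title("Desktop")

folders = tk.Listbox(root)
for name in [d for d in os.listdir(".") if os.path.isdir(d)]:
    folders.insert(tk.END, name)
folders.pack(padx=10, pady=10)

def open_folder(event):
    # Open a new window listing the selected folder's contents.
    selection = folders.curselection()
    if not selection:
        return
    name = folders.get(selection[0])
    window = tk.Toplevel(root)
    window.title(name)
    contents = tk.Listbox(window)
    for entry in os.listdir(name):
        contents.insert(tk.END, entry)
    contents.pack(padx=10, pady=10)

folders.bind("<Double-Button-1>", open_folder)
root.mainloop()
```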
Desktop Metaphor:
The idea of metaphors has brought the computer closer to the natural environment of
the end-user. The concept of the physical metaphor paradigm, developed by Alan Kay,
initiated much of the research on graphical user interfaces, together with a new programming
approach called object-oriented programming; a detailed discussion of that subject is beyond
the scope of this unit. The physical metaphor is a way of saying that the visual displays of a
computer system should present the images of real physical objects. For example, the
wastepaper basket icon can be used to discard objects from the system by simply dragging
the unwanted objects into it, as in real life.
The desktop metaphor has probably been the most famous paradigm. Because of the
large set of potential office users, this metaphor can have the most dramatic effect.
In this paradigm, the computer presents information and objects as they would
appear and behave in an office, using icons for folders, in-baskets, out-baskets and
calendars. In a desktop metaphor, users are not aware of applications; they deal with files,
folders, drawers, a clipboard and an outbox. Instead of starting the word processor and
loading a file, users merely open the report document: clicking the mouse on the icon
representing the report causes the word processor to start and to load the report file
implicitly. Today, several computing environments provide this capability.
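The document-centred idea behind this metaphor can be sketched as a mapping from document types to the applications that handle them. The mapping, the command names and the launch mechanism below are illustrative assumptions, not a description of any particular system.

```python
import os
import subprocess

# Association between document types and the applications that handle them.
ASSOCIATIONS = {
    ".txt": "notepad",      # hypothetical word-processor command
    ".csv": "spreadsheet",  # hypothetical spreadsheet command
}

def open_document(path):
    """The user 'opens the report'; the associated application starts implicitly."""
    extension = os.path.splitext(path)[1]
    application = ASSOCIATIONS.get(extension)
    if application is None:
        raise ValueError(f"No application associated with {extension!r}")
    # Launch the associated application with the document as its argument.
    subprocess.run([application, path])

# Double-clicking a report icon would, in effect, call:
# open_document("report.txt")
```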
The 3D GUI:
The desktop metaphor GUI is 2½D. It is 2D because its visual elements are two-dimensional:
they lie in the xy plane, are defined in 2D coordinates, are flat and contain
only planar regions (areas). It is 2½D because where visual elements overlap they
obscure each other according to their priority. In a 3D GUI the visual elements are
genuinely three-dimensional: they are situated in xyz space, are defined in terms of 3D
coordinates, need not be flat and may contain spatial regions (volumes).
The design considerations for a 3D GUI appear more complex than for a 2½D GUI.
To begin with, the issues of metaphor and elements arise afresh. The desktop
metaphor, with its windows, icons, menus and pointing-device elements, is firmly
established for 2½D GUIs. In contrast, no clearly defined metaphor and set of
elements for 3D GUIs are manifest yet. 3D GUIs offer considerably more scope
for metaphors than 2½D GUIs; there are many metaphors which could be based on
our physical 3D environment, including the obvious extension of the desktop metaphor
into a 3D office metaphor. On the other hand, much more abstract metaphors are possible,
such as one based on "starmaps", where objects are simply placed somewhere in
"cyberspace". Likewise the elements of a 3D GUI may resemble, or differ
substantially from, the elements of the 2½D GUI.
Various prototypes have been developed that provide the same elements in a
3D GUI as in the 2½D desktop GUI: windows, icons, menus, a general space in
which to arrange the visual elements, a cursor and an input device to manipulate
the cursor.