“Learn the rules so you know how to break them properly”—The Dalai Lama

Terms & Definitions

Accessibility is the degree to which a product, device, service, or environment is available to as many people as possible. Accessibility can be viewed as the "ability to access" and benefit from some system or entity. The concept often focuses on people with disabilities or special needs (such as the Convention on the Rights of Persons with Disabilities) and their right of access, enabling the use of assistive technology.

Accessibility is not to be confused with usability, which is the extent to which a product (such as a device, service, or environment) can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.

Accessibility is strongly related to universal design when the approach involves "direct access." This is about making things accessible to all people (whether they have a disability or not). An alternative is to provide "indirect access" by having the entity support the use of a person's assistive technology to achieve access (for example, computer screen readers).



Ad Hoc is a Latin phrase meaning "for this". It generally signifies a solution designed for a specific problem or task, non-generalizable and not intended to be adapted to other purposes (compare a priori). Common examples are organizations, committees, and commissions created at the national or international level for a specific task. In other fields the term may refer, for example, to a military unit created under special circumstances, a tailor-made suit, a handcrafted network protocol, or a purpose-specific equation. Ad hoc can also mean makeshift solutions, shifting contexts to create new meanings, inadequate planning, or improvised events.




Affinity Diagram is a business tool used to organize ideas and data. It is one of the Seven Management and Planning Tools.

The tool is commonly used within project management and allows large numbers of ideas stemming from brainstorming[1] to be sorted into groups, based on their natural relationships, for review and analysis.[2] It is also frequently used in contextual inquiry as a way to organize notes and insights from field interviews. It can also be used for organizing other freeform comments, such as open-ended survey responses, support call logs, or other qualitative data.

People have been grouping data based on natural relationships for thousands of years; however, the term affinity diagram was devised by Jiro Kawakita in the 1960s[3] and is sometimes referred to as the KJ Method.




Agile Software Development is a group of software development methods based on iterative and incremental development, where requirements and solutions evolve through collaboration between self-organizing, cross-functional teams. It promotes adaptive planning, evolutionary development and delivery, a time-boxed iterative approach, and encourages rapid and flexible response to change. It is a conceptual framework that promotes foreseen interactions throughout the development cycle. The Agile Manifesto introduced the term in 2001.



Back-End Database is a database that is accessed by users indirectly through an external application rather than by application programming stored within the database itself or by low level manipulation of the data (e.g. through SQL commands).

A back-end database stores data but does not include end-user application elements such as stored queries, forms, macros or reports.




Bubble Chart is a type of chart that displays three dimensions of data. Each entity with its triplet (v1, v2, v3) of associated data is plotted as a disk that expresses two of the values through the disk's xy location and the third through its size. Bubble charts can facilitate the understanding of social, economic, medical, and other scientific relationships.

Bubble charts can be considered a variation of the scatter plot, in which the data points are replaced with bubbles. As the documentation for Microsoft Office explains, “this type of chart can be used instead of a Scatter chart if your data has three data series, each of which contains a set of values”.
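Disk size is usually mapped so that area, rather than radius, is proportional to the third value; scaling radius directly would visually exaggerate large values. A minimal sketch of that scaling in Python (the sample data and `max_radius` are illustrative):

```python
import math

def bubble_sizes(values, max_radius=40.0):
    """Map a third data dimension to disk radii so that *area*,
    not radius, is proportional to the value."""
    peak = max(values)
    return [max_radius * math.sqrt(v / peak) for v in values]

# Each triplet (v1, v2, v3): x, y position plus bubble size.
data = [(1, 4, 10), (2, 7, 40), (3, 5, 90)]
radii = bubble_sizes([v3 for _, _, v3 in data])
```

The largest value gets the full radius; the others shrink with the square root of their share of it.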





Card sorting is a simple technique in user experience design in which a group of subject experts or “users”, however inexperienced with design, is guided to generate a category tree or folksonomy. It is a useful approach for designing information architecture, workflows, menu structure, or web site navigation paths.
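Results from several participants are often summarized as a co-occurrence matrix: the more often two cards land in the same pile, the stronger the evidence that they belong to one category. A small sketch of that tally (the card names are made up):

```python
from itertools import combinations
from collections import Counter

def co_occurrence(sorts):
    """Count how often each pair of cards is placed in the same
    group across participants' sorts."""
    counts = Counter()
    for groups in sorts:            # one sort per participant
        for group in groups:
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return counts

sorts = [
    [{"login", "logout"}, {"search", "browse"}],
    [{"login", "logout", "search"}, {"browse"}],
]
pairs = co_occurrence(sorts)
```

Here both participants grouped "login" with "logout", so that pair scores highest.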


Chrome The visible graphical interface features of an application are sometimes referred to as “chrome”.




Chromostereopsis is a visual illusion whereby an impression of depth is conveyed in two-dimensional color images: colors of strongly differing wavelengths, most notably red and blue, viewed side by side can appear to lie at different depths.



Command-Line Interface (CLI) a means of interacting with a computer program by typing lines of text (commands), in contrast to a graphical user interface.



Comp (graphic design) a rapidly drawn but high-quality sketch intended for presentation purposes. Traditionally comps are created as quick color sketches done in marker, often used for client presentations especially in advertising and architecture. A comp is usually intended to be a very close approximation to the final production image so that it can easily be evaluated without the ambiguity of a rough sketch.





Content Management System (CMS) a computer program that allows publishing, editing and modifying content as well as maintenance from a central interface. Such systems of content management provide procedures to manage workflow in a collaborative environment.[4] These procedures can be manual steps or an automated cascade.

The first content management system was announced at the end of the 1990s. This CMS was designed to simplify the complex task of writing numerous versions of code and to make the website development process more flexible. CMS platforms allow users to centralize data editing, publishing and modification on a single back-end interface. CMS platforms are often used as blog software.




Contextual Design (CD) is a user-centered design process developed by Hugh Beyer and Karen Holtzblatt. It incorporates ethnographic methods for gathering data relevant to the product via field studies, rationalizing workflows, and designing human-computer interfaces. In practice, this means that researchers aggregate data from customers in the field, where people live and work, and apply these findings to a final product.[1] Contextual Design can be seen as an alternative to engineering and feature driven models of creating new systems.

The Contextual Design process consists of the following top-level steps: Contextual Inquiry, Interpretation, Data Consolidation, Visioning, Storyboarding, User Environment Design, and Prototyping.




Data Visualization is the study of the visual representation of data, meaning “information that has been abstracted in some schematic form, including attributes or variables for the units of information.”




Dendrogram (from Greek dendron “tree”, -gramma “drawing”) is a tree diagram frequently used to illustrate the arrangement of the clusters produced by hierarchical clustering. Dendrograms are often used in computational biology to illustrate the clustering of genes or samples.
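The hierarchy a dendrogram depicts comes from agglomerative clustering: start with every point in its own cluster and repeatedly merge the two closest clusters. An illustrative single-linkage implementation on 1-D values, recording the merge history a dendrogram would draw bottom to top (the sample values are made up):

```python
def single_linkage(points):
    """Agglomerative clustering with single linkage: repeatedly merge
    the two clusters whose closest members are nearest. Returns the
    merge history (cluster_a, cluster_b, distance)."""
    clusters = {i: [p] for i, p in enumerate(points)}
    merges = []
    next_id = len(points)
    while len(clusters) > 1:
        (a, b), dist = min(
            (((i, j), min(abs(p - q) for p in clusters[i] for q in clusters[j]))
             for i in clusters for j in clusters if i < j),
            key=lambda item: item[1])
        clusters[next_id] = clusters.pop(a) + clusters.pop(b)
        merges.append((a, b, dist))
        next_id += 1
    return merges

# 0 and 1 merge first, then 10 and 12, then the two resulting clusters.
history = single_linkage([0.0, 1.0, 10.0, 12.0])
```

Each merge becomes one junction in the tree, drawn at a height equal to the merge distance.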




Digital Strategy is the process of specifying an organization's vision, goals, opportunities and initiatives in order to maximize the business benefits of digital initiatives to the organization. These can range from an enterprise focus, which considers the broader opportunities and risks that digital potentially creates (e.g., changes in the publishing industry) and often includes customer intelligence, collaboration, new product/market exploration, sales and service optimization, enterprise technology architectures and processes, innovation and governance; to more marketing and customer-focused efforts such as web sites, mobile, eCommerce, social, site and search engine optimization, and advertising.




Drop-off loss of a user’s interest to the point where they log off the site or go to another site.



eCRM or electronic customer relationship marketing is a concept derived from e-commerce. It uses networked environments, i.e., intranet, extranet, and internet. Electronic CRM concerns all forms of managing relationships with customers that make use of information technology (IT): enterprises use IT to integrate internal organizational resources and external marketing strategies in order to understand and fulfill their customers' needs. Compared with traditional CRM, the information integrated for eCRM's intraorganizational collaboration can make communication with customers more efficient.




EM a relative unit used in the measurement of type, equal to the current point size of the font: in 12-point type, 1 em = 12 points, 0.5 em = 6 points, and 2 em = 24 points.
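Because the em is relative, converting it to points is a simple product with the current font size; a sketch (the 12 pt and 16 pt sizes are just examples):

```python
def em_to_points(em, font_size_pt):
    """1 em equals the current font size in points, so the
    conversion is em * font size."""
    return em * font_size_pt

em_to_points(1, 12)    # 12 points in 12 pt type
em_to_points(0.5, 12)  # 6.0
em_to_points(2, 16)    # 32 -- the same 2 em is larger in 16 pt type
```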



Facilitator The person who conducts an interview of the participant, often, but not always, accompanied by an observer who takes notes.



Folksonomy is a system of classification derived from the practice and method of collaboratively creating and managing tags to annotate and categorize content; this practice is also known as collaborative tagging,[3] social classification, social indexing, and social tagging. Folksonomy, a term coined by Thomas Vander Wal, is a portmanteau of folk and taxonomy.



Front end and Back End, in addition to their obvious use in everyday English, are generalised terms that refer to the initial and the final stages of a process.[citation needed] The front end is responsible for collecting input in various forms from the user and processing it to conform to a specification the back end can use. The front end is an interface between the user and the back end.

In software architecture there may be many layers between the hardware and end user. Each can be spoken of as having a front end and a back end. The front is an abstraction, simplifying the underlying component by providing a user-friendly interface.

In software design, the model-view-controller architecture for example, provides front and back ends for the database, the user, and the data processing components. The separation of software systems into front and back ends simplifies development and separates maintenance. A rule of thumb is that the front (or "client") side is anything you can see when you view the code source of the page (when you are on the client side, i.e. not on the server.) To view the server-side (or "back-end") code, you must be on the server. The confusion arises when you have to make front-end edits to server-side files. Most HTML designers, for instance, don't need to be on the server when they are developing the HTML; conversely, the server-side engineers are, by definition, never on anything but a server. It takes both, to be sure, to ultimately make a functioning, interactive website.




Fourth-Generation Programming Language (1970s-1990) (abbreviated 4GL) is better understood as a fourth-generation environment: packages of systems development software including very high level programming languages.[1] A very high level programming language and a development environment or 'Analyst Workbench' designed with a central data dictionary system, a library of loosely coupled design patterns, a CRUD generator, report generator, end-user query language, DBMS, visual design tool and integration API. Historically often used for prototyping and evolutionary development of commercial business software[citation needed]. In the history of computer science, the 4GL followed the 3GL in an upward trend toward higher abstraction and statement power[citation needed]. The 4GL was followed by efforts to define and use a 5GL.

The natural-language, block-structured mode of the third-generation programming languages improved the process of software development. However, 3GL development methods can be slow and error-prone. It became clear that some applications could be developed more rapidly by adding a higher-level programming language and methodology which would generate the equivalent of very complicated 3GL instructions with fewer errors[citation needed]. In some senses, software engineering arose to handle 3GL development. 4GL and 5GL projects are more oriented toward problem solving and systems engineering.

All 4GLs are designed to reduce programming effort, the time it takes to develop software, and the cost of software development. They are not always successful in this task, sometimes resulting in inelegant and unmaintainable code. However, given the right problem, the use of an appropriate 4GL can be spectacularly successful as was seen with MARK-IV and MAPPER (see History Section, Santa Fe real-time tracking of their freight cars – the productivity gains were estimated to be 8 times over COBOL). The usability improvements obtained by some 4GLs (and their environment) allowed better exploration for heuristic solutions than did the 3GL.

A quantitative definition of 4GL has been set by Capers Jones, as part of his work on function point analysis. Jones defines the various generations of programming languages in terms of developer productivity, measured in function points per staff-month[citation needed]. A 4GL is defined as a language that supports 12–20 function points per staff month. This correlates with about 16–27 lines of code per function point implemented in a 4GL[citation needed].

Fourth-generation languages have often been compared to domain-specific programming languages (DSLs). Some researchers state that 4GLs are a subset of DSLs.



GNU Compiler Collection (GCC) is a compiler system produced by the GNU Project supporting various programming languages. GCC is a key component of the GNU toolchain. As well as being the official compiler of the unfinished GNU operating system, GCC has been adopted as the standard compiler by most other modern Unix-like computer operating systems, including Linux, and the BSD family. A port to RISC OS has also been developed extensively in recent years. There is also an old (3.0) port of GCC to Plan9, running under its ANSI/POSIX Environment (APE). GCC is also available for Microsoft Windows operating systems, and for the ARM processor used by many portable devices.

GCC has been ported to a wide variety of processor architectures, and is widely deployed as a tool in proprietary development environments[citation needed]. GCC is also available for most embedded platforms, including Symbian (called gcce),[3] AMCC and Freescale Power Architecture-based chips.[4] The compiler can target a wide variety of platforms, including videogame consoles such as the PlayStation 2[5] and Dreamcast.[6] Several companies make a business out of supplying and supporting GCC ports to various platforms, and chip manufacturers[who?] today consider a GCC port almost essential to the success of an architecture.[citation needed]

Originally named the GNU C Compiler, because it only handled the C programming language, GCC 1.0 was released in 1987, and the compiler was extended to compile C++ in December of that year.[1] Front ends were later developed for Objective-C, Objective-C++, Fortran, Java, Ada, and Go among others.

The Free Software Foundation (FSF) distributes GCC under the GNU General Public License (GNU GPL). GCC has played an important role in the growth of free software, as both a tool and an example.




Graphical User Interface (GUI) a type of user interface that allows users to interact with electronic devices through graphical icons and visual indicators, as opposed to text-based interfaces, typed command labels, or text navigation.




Haptic Feedback is a tactile feedback technology which takes advantage of the sense of touch by applying forces, vibrations, or motions to the user. This mechanical stimulation can be used to assist in the creation of virtual objects in a computer simulation, to control such virtual objects, and to enhance the remote control of machines and devices (telerobotics). It has been described as "doing for the sense of touch what computer graphics does for vision". Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface.




Hashtag a word or a phrase prefixed with the symbol #, a form of metadata tag. Short messages on microblogging social networking services such as Twitter, Tout, identi.ca, or Google+ may be tagged by including one or more hashtags, with multiple words concatenated, e.g.:

#Wikipedia is an #encyclopedia

Hashtags provide a means of grouping such messages, since one can search for the hashtag and get the set of messages that contain it.
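Extracting hashtags from a message is commonly done with a regular expression; a simplified sketch (real services apply additional rules, e.g. rejecting all-digit tags):

```python
import re

# Simplified pattern: '#' followed by letters, digits, or underscores.
HASHTAG = re.compile(r"#(\w+)")

def hashtags(message):
    """Return the hashtag words found in a message, in order."""
    return HASHTAG.findall(message)

hashtags("#Wikipedia is an #encyclopedia")  # ['Wikipedia', 'encyclopedia']
```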



Heuristic Evaluation is a usability inspection method for computer software that helps to identify usability problems in the user interface (UI) design. It specifically involves evaluators examining the interface and judging its compliance with recognized usability principles (the "heuristics"). These evaluation methods are now widely taught and practiced in the New Media sector, where UIs are often designed in a short space of time and on a budget that may not allow for other types of interface testing.




High-Fidelity Prototype a prototype that is quite close to the final product, with lots of detail and functionality. From a user testing point of view, a high-fidelity prototype is close enough to a final product to be able to examine usability questions in detail and make strong conclusions about how behavior will relate to use of the final product.



Holistic relating to or concerned with wholes or with complete systems rather than with the analysis of, treatment of, or dissection into parts (holistic medicine attempts to treat both the mind and the body; holistic ecology views humans and the environment as a single system.)



Horizontal Prototypes display a wide range of features but without fully implementing all of those features; they are appropriate for understanding relationships across a broad system and for showing the range of abilities of a system.




Human–Computer Interaction (HCI) involves the study, planning, and design of the interaction between people (users) and computers. It is often regarded as the intersection of computer science, behavioral sciences, design and several other fields of study. The term was popularized by Card, Moran, and Newell in their seminal 1983 book, "The Psychology of Human-Computer Interaction", although the authors first used the term in 1980,[1] and the first known use was in 1975.[2] The term connotes that, unlike other tools with only limited uses (such as a hammer, useful for driving nails, but not much else), a computer has many affordances for use and this takes place in an open-ended dialog between the user and the computer.

Because human–computer interaction studies a human and a machine in conjunction, it draws from supporting knowledge on both the machine and the human side. On the machine side, techniques in computer graphics, operating systems, programming languages, and development environments are relevant. On the human side, communication theory, graphic and industrial design disciplines, linguistics, social sciences, cognitive psychology, and human factors such as computer user satisfaction are relevant. Engineering and design methods are also relevant. Due to the multidisciplinary nature of HCI, people with different backgrounds contribute to its success. HCI is also sometimes referred to as man–machine interaction (MMI) or computer–human interaction (CHI).

Attention to human-machine interaction is important because poorly designed human-machine interfaces can lead to many unexpected problems. A classic example of this is the Three Mile Island accident, a nuclear meltdown accident, where investigations concluded that the design of the human–machine interface was at least partially responsible for the disaster.[3][4][5] Similarly, accidents in aviation have resulted from manufacturers' decisions to use non-standard flight instrument and/or throttle quadrant layouts: even though the new designs were proposed to be superior in regards to basic human–machine interaction, pilots had already internalized the "standard" layout, and thus the conceptually good idea actually had undesirable results.



i18n numeronym for Internationalization, the process of designing a software application so that it can be adapted to various languages and regions without engineering changes.



Ideation is the creative process of generating, developing, and communicating new ideas, where an idea is understood as a basic element of thought that can be either visual, concrete, or abstract. Ideation is all stages of a thought cycle, from innovation, to development, to actualization. As such, it is an essential part of the design process, both in education and practice.



Information Architecture (IA) is the art and science of organizing and labelling websites, intranets, online communities, software, books, and other media of information to support usability.[1] It is an emerging discipline and community of practice focused on bringing together principles of design and architecture, primarily to the digital landscape.[2][page needed] Typically, it involves a model or concept of information which is used and applied to activities that require explicit details of complex information systems. These activities include library systems and database development.



Information Design is the practice of presenting information in a way that fosters efficient and effective understanding of it. The term has come to be used specifically for graphic design for displaying information effectively, rather than just attractively or for artistic expression. Today, information design is closely related to the field of data visualization. Information design is often taught as part of graphic design courses.



Information Silo is a management system incapable of reciprocal operation with other, related information systems. For example, a bank's management system is considered a silo if it cannot exchange information with other related systems within its own organization, or with the management systems of its customers, vendors, or business partners. "Information silo" is a pejorative expression that is useful for describing the absence of operational reciprocity. In information technology, disparate systems that lack this operational reciprocity are also commonly referred to as disparate data systems. Derived variants are "silo thinking", "silo vision", and "silo mentality".

The expression is typically applied to management systems where the focus is inward and information communication is vertical. Critics of silos contend that managers serve as information gatekeepers, making timely coordination and communication among departments difficult to achieve, and seamless interoperability with external parties impractical. They hold that silos tend to limit productivity in practically all organizations, provide greater opportunity for security lapses and privacy breaches, and frustrate consumers who increasingly expect information to be immediately available and complete. Although much has been written about them, information silos are becoming far more recognized as the major reason why organizations are unable to take full advantage of the Internet's power to interconnect business processes.



Interface Design deals with the process of developing a method for two (or more) modules in a system to connect and communicate. These modules can apply to hardware, software or the interface between a user and a machine.[1][2][3] An example of a user interface could include a GUI, a control panel for a nuclear power plant,[4] or even the cockpit of an aircraft.



Interaction Design, often abbreviated IxD, is "about shaping digital things for people’s use",[1] alternately defined as "the practice of designing interactive digital products, environments, systems, and services."[2]:xxxi,1 Like many other design fields interaction design also has an interest in form but its main focus is on behavior.[2]:1 What clearly marks interaction design as a design field as opposed to a science or engineering field is that it is synthesis and imagining things as they might be, more so than focusing on how things are.[2]:xviii
Interaction design is heavily focused on satisfying the needs and desires of the people who will use the product,[2]:xviii whereas other disciplines, like software engineering, focus heavily on designing for the technical stakeholders of a project.



Iterative Design the idea that design should be done in repeated cycles where, in each cycle, the design is elaborated, refined, and tested, and the results of testing at each cycle feed into the design focus of the next cycle.

This is identical in spirit to the notion of developing a software product through a series of continually refined prototypes, and the idea of developing generations of a software product through an iterative development cycle (such as the Spiral model of development).



JAD (Joint Application Development) a methodology that involves the client or end users in the design and development of an application, typically through a succession of collaborative workshops.



jQuery is a multi-browser JavaScript library designed to simplify the client-side scripting of HTML.[4] It was released in January 2006 at BarCamp NYC by John Resig. It is currently developed by a team of developers led by Dave Methvin. Used by over 55% of the 10,000 most visited websites, jQuery is the most popular JavaScript library in use today.

jQuery is free, open source software, licensed under the MIT License. jQuery's syntax is designed to make it easier to navigate a document, select DOM elements, create animations, handle events, and develop Ajax applications. jQuery also provides capabilities for developers to create plug-ins on top of the JavaScript library. This enables developers to create abstractions for low-level interaction and animation, advanced effects and high-level, theme-able widgets. The modular approach to the jQuery library allows the creation of powerful dynamic web pages and web applications.



KPI (key performance indicator) is industry jargon for a type of performance measurement. KPIs are commonly used by an organization to evaluate its success or the success of a particular activity in which it is engaged. Sometimes success is defined in terms of making progress toward strategic goals, but often success is simply the repeated achievement of some level of operational goal (for example, zero defects, 10/10 customer satisfaction, etc.). Accordingly, choosing the right KPIs is reliant upon having a good understanding of what is important to the organization. 'What is important' often depends on the department measuring the performance - the KPIs useful to finance will be quite different than the KPIs assigned to sales, for example. Because of the need to develop a good understanding of what is important, performance indicator selection is often closely associated with the use of various techniques to assess the present state of the business, and its key activities. These assessments often lead to the identification of potential improvements; and as a consequence, performance indicators are routinely associated with 'performance improvement' initiatives. A very common way for choosing KPIs is to apply a management framework such as the balanced scorecard.



L10n numeronym for Localization, the process of adapting internationalized software for a specific region or language by adding locale-specific components and translating text.



Low-Fidelity Prototype a prototype that is sketchy and incomplete, that has some characteristics of the target product but is otherwise simple, usually in order to quickly produce the prototype and test broad concepts.



Low-Hanging Fruit Targets or goals which are easily achievable and which do not require a lot of effort.



Media Anthropology is an area of study within social or cultural anthropology that emphasizes ethnographic studies as a means of understanding producers, audiences, and other cultural and social aspects of mass media.




Mental Model the user’s perception of an object or process.



Mind Map is a diagram used to visually outline information. A mind map is often created around a single word or text, placed in the center, to which associated ideas, words and concepts are added. Major categories radiate from a central node, and lesser categories are sub-branches of larger branches. Categories can represent words, ideas, tasks, or other items related to a central key word or idea.



Mockup, or Mock-up, is a scale or full-size model of a design or device, used for teaching, demonstration, design evaluation, promotion, and other purposes. A mockup is a prototype if it provides at least part of the functionality of a system and enables testing of a design.[1] Mock-ups are used by designers mainly to acquire feedback from users. Mock-ups address the idea captured in a popular engineering one-liner: You can fix it now on the drafting board with an eraser or you can fix it later on the construction site with a sledge hammer.



Model–view–controller (MVC) is a software architecture pattern that separates the representation of information from the user's interaction with it.[1][2] The model consists of application data, business rules, logic, and functions. A view can be any output representation of data, such as a chart or a diagram. Multiple views of the same data are possible, such as a pie chart for management and a tabular view for accountants. The controller mediates input, converting it to commands for the model or view.[3] The central ideas behind MVC are code reusability and separation of concerns.

The model-view-controller pattern was originally formulated in the late 1970s by Trygve Reenskaug at Xerox PARC, as part of the Smalltalk system.
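The separation the pattern describes can be sketched in a few classes; this is a minimal illustration, not any particular framework's API (the class and method names are invented for the example):

```python
class Model:
    """Holds application data and business rules."""
    def __init__(self):
        self.items = []
    def add(self, item):
        self.items.append(item)

class View:
    """One possible output representation of the model's data."""
    def render(self, model):
        return ", ".join(model.items)

class Controller:
    """Mediates input: converts commands into model updates,
    then asks the view to re-render."""
    def __init__(self, model, view):
        self.model, self.view = model, view
    def handle(self, command, payload=None):
        if command == "add":
            self.model.add(payload)
        return self.view.render(self.model)

app = Controller(Model(), View())
app.handle("add", "first")
app.handle("add", "second")  # view renders "first, second"
```

Because the view only reads from the model, a second view (say, a chart) could render the same data without touching the controller or model.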



Neuro-linguistic programming (NLP) is an approach to communication, personal development, and psychotherapy created by Richard Bandler and John Grinder in California, USA, in the 1970s. Its proponents claim a connection between the neurological processes (“neuro”), language (“linguistic”) and behavioral patterns learned through experience (“programming”) and that these can be changed to achieve specific goals in life.



Non-Disclosure Agreement (NDA), also known as a confidentiality agreement (CA), confidential disclosure agreement (CDA), proprietary information agreement (PIA), or secrecy agreement, is a legal contract between at least two parties that outlines confidential material, knowledge, or information that the parties wish to share with one another for certain purposes, but wish to restrict access to or by third parties. It's a contract through which the parties agree not to disclose information covered by the agreement. An NDA creates a confidential relationship between the parties to protect any type of confidential and proprietary information or trade secrets. As such, an NDA protects nonpublic business information.

NDAs are commonly signed when two companies, individuals, or other entities (such as partnerships, societies, etc.) are considering doing business and need to understand the processes used in each other's business for the purpose of evaluating the potential business relationship. NDAs can be "mutual", meaning both parties are restricted in their use of the materials provided, or they can restrict the use of material by a single party.

It is also possible for an employee to sign an NDA or NDA-like agreement with an employer. In fact, some employment agreements will include a clause restricting employees' use and dissemination of company-owned "confidential information."



Numeronym a number-based word, in which a number forms part of an abbreviation (e.g., "i18n" for "internationalization", where 18 counts the letters between the first i and the last n).
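The i18n/L10n style of numeronym follows a simple rule: keep the first and last letters and replace the interior with its letter count. A sketch of that rule:

```python
def numeronym(word):
    """Keep the first and last letters and replace the interior
    letters with their count -- the pattern behind i18n and l10n
    (the capital L in 'L10n' is a stylistic convention)."""
    if len(word) <= 3:
        return word
    return f"{word[0]}{len(word) - 2}{word[-1]}"

numeronym("internationalization")  # 'i18n'
numeronym("localization")          # 'l10n'
```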



Observer In an interview, the observer takes notes regarding the participant's actions in response to questions and task requests given by the facilitator.



Paper Prototyping is a widely used method in the user-centered design process, a process that helps developers to create software that meets the user's expectations and needs - in this case, especially for designing and testing user interfaces. It is throwaway prototyping and involves creating rough, even hand-sketched, drawings of an interface to use as prototypes, or models, of a design. While paper prototyping seems simple, this method of usability testing can provide a great deal of useful feedback which will result in the design of better products. This is supported by many usability professionals.

Paper prototyping started in the mid 1980s and then became popular in the mid 1990s when companies such as IBM, Honeywell, Microsoft, and others started using the technique in developing their products. Today, paper prototyping is used widely in user centered design by usability professionals. More recently, digital paper prototyping has been advocated by companies like Pidoco due to advantages in terms of collaboration, flexibility and cost.



Participant the “User” who is the focus of an interview.



Participatory Design (PD) (known before as 'Cooperative Design') is an approach to design attempting to actively involve all stakeholders (e.g. employees, partners, customers, citizens, end users) in the design process in order to help ensure the product designed meets their needs and is usable. The term is used in a variety of fields e.g. software design, urban design, architecture, landscape architecture, product design, sustainability, graphic design, planning or even medicine as a way of creating environments that are more responsive and appropriate to their inhabitants' and users' cultural, emotional, spiritual and practical needs. It is one approach to placemaking. It has been used in many settings and at various scales. Participatory design is an approach which is focused on processes and procedures of design and is not a design style. For some, this approach has a political dimension of user empowerment and democratization. For others, it is seen as a way of abrogating design responsibility and innovation by designers.

In several Scandinavian countries of the 1960s and 1970s, it was rooted in work with trade unions; its ancestry also includes Action research and Sociotechnical Design.[1]




Problem Solving consists in using generic or ad hoc methods, in an orderly manner, for finding solutions to problems. Some of the problem-solving techniques developed and used in artificial intelligence, computer science, engineering, mathematics, medicine, etc. are related to mental problem-solving techniques studied in psychology.


Requirements Analysis or Requirements Gathering in systems engineering and software engineering, encompasses those tasks that go into determining the needs or conditions to meet for a new or altered product, taking account of the possibly conflicting requirements of the various stakeholders, analyzing, documenting, validating and managing software or system requirements.[2]
Requirements analysis is critical to the success of a systems or software project.[3] The requirements should be documented, actionable, measurable, testable, traceable, related to identified business needs or opportunities, and defined to a level of detail sufficient for system design.

Conceptually, requirements analysis includes three types of activities:
Eliciting requirements: the task of identifying the various types of requirements from various sources including project documentation, (e.g. the project charter or definition), business process documentation, and stakeholder interviews. This is sometimes also called requirements gathering.
Analyzing requirements: determining whether the stated requirements are clear, complete, consistent and unambiguous, and resolving any apparent conflicts.
Recording requirements: Requirements may be documented in various forms, usually including a summary list, and may include natural-language documents, use cases, user stories, or process specifications.

Requirements analysis can be a long and arduous process during which many delicate psychological skills are involved. New systems change the environment and relationships between people, so it is important to identify all the stakeholders, take into account all their needs and ensure they understand the implications of the new systems. Analysts can employ several techniques to elicit the requirements from the customer. These may include the development of scenarios (represented as user stories in agile methods), the identification of use cases, the use of workplace observation or ethnography, holding interviews, or focus groups (more aptly named in this context as requirements workshops, or requirements review sessions) and creating requirements lists. Prototyping may be used to develop an example system that can be demonstrated to stakeholders. Where necessary, the analyst will employ a combination of these methods to establish the exact requirements of the stakeholders, so that a system that meets the business needs is produced.



Relational Database Management Systems (RDBMS) database management systems based on the relational model of data, in which data is organized into tables (relations) of rows and columns. See Structured Query Language (SQL).



Responsive Web Design (RWD) is a web design approach aimed at crafting sites to provide an optimal viewing experience—easy reading and navigation with a minimum of resizing, panning, and scrolling—across a wide range of devices (from desktop computer monitors to mobile phones).



Rich Data comprehensive information with a high degree of usefulness.



Rich Observations robust observations with a high degree of relevance.



Ruby on Rails, often shortened to Rails, is an open source full-stack web application framework for the Ruby programming language. Ruby on Rails runs on the general-purpose programming language Ruby, which predates it by more than a decade. Rails is a full-stack framework, meaning that it gives the web developer the ability to create pages and applications that gather information from the web server, talk to or query the database, and render templates out of the box. As a result, Rails features a routing system that is independent of the web server.

Ruby on Rails emphasizes the use of well-known software engineering patterns and principles, such as active record pattern, convention over configuration, don't repeat yourself and model-view-controller.
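The active record pattern mentioned above—an object that wraps one database row and carries its own persistence logic—can be sketched outside Ruby as well. The following Python sketch uses an in-memory SQLite database; the `User` class and its columns are hypothetical, chosen only to illustrate the pattern:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

class User:
    """Active record: the object wraps one row and knows how to persist itself."""
    def __init__(self, name, id=None):
        self.id, self.name = id, name

    def save(self):
        if self.id is None:  # new object: INSERT and remember the generated key
            cur = conn.execute("INSERT INTO users (name) VALUES (?)", (self.name,))
            self.id = cur.lastrowid
        else:                # existing object: UPDATE its row in place
            conn.execute("UPDATE users SET name = ? WHERE id = ?", (self.name, self.id))

    @classmethod
    def find(cls, id):
        row = conn.execute("SELECT id, name FROM users WHERE id = ?", (id,)).fetchone()
        return cls(id=row[0], name=row[1]) if row else None

u = User("Ada")
u.save()
print(User.find(u.id).name)  # Ada
```

Rails' ActiveRecord adds conventions on top of this idea (table names inferred from class names, columns mapped to attributes automatically), which is where "convention over configuration" comes in.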



Scenario 1. A concrete, often narrative description of a user performing a task in a specific context. Often a use scenario describes a desired or to-be-built function. This contrasts with a task scenario, which describes a currently implemented function. 2. A prescribed set of conditions under which a user will perform a set of tasks to achieve an objective defined by the developer.



Scrum is an iterative and incremental agile software development framework for managing software projects and product or application development. Its focus is on "a flexible, holistic product development strategy where a development team works as a unit to reach a common goal" as opposed to a "traditional, sequential approach".

Scrum was first described in those terms in 1986 by Hirotaka Takeuchi and Ikujiro Nonaka in "The New New Product Development Game".




Service Design is the activity of planning and organizing people, infrastructure, communication and material components of a service in order to improve its quality and the interaction between service provider and customers. The purpose of service design methodologies is to design according to the needs of customers or participants, so that the service is user-friendly, competitive and relevant to the customers. The backbone of this process is to understand the behavior of the customers, their needs and motivations. Service designers draw on the methodologies of fields such as ethnography and journalism to gather customer insights through interviews and by shadowing service users. Many observations are synthesized to generate concepts and ideas that are typically portrayed visually, for example in sketches or service prototypes. Service design may inform changes to an existing service or creation of new services.




Search Engine Optimization (SEO) is the process of affecting the visibility of a website or a web page in a search engine's "natural" or un-paid ("organic") search results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search,[1] news search and industry-specific vertical search engines.


As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.

The plural of the abbreviation SEO can also refer to "search engine optimizers," those who provide SEO services.



Section 508 was enacted to eliminate barriers in information technology, to make available new opportunities for people with disabilities, and to encourage development of technologies that will help achieve these goals. The law applies to all Federal agencies when they develop, procure, maintain, or use electronic and information technology. Under Section 508 (29 U.S.C. § 794d), agencies must give disabled employees and members of the public access to information that is comparable to the access available to others.



Separation of Concerns (SoC) is a design principle for separating a computer program into distinct sections, such that each section addresses a separate concern. A concern is a set of information that affects the code of a computer program. A concern can be as general as the details of the hardware the code is being optimized for, or as specific as the name of a class to instantiate. A program that embodies SoC well is called a modular[1] program. Modularity, and hence separation of concerns, is achieved by encapsulating information inside a section of code that has a well defined interface. Encapsulation is a means of information hiding.[2] Layered designs in information systems are another embodiment of separation of concerns (e.g., presentation layer, business logic layer, data access layer, database layer).

The value of separation of concerns is simplifying development and maintenance of computer programs. When concerns are well separated, individual sections can be developed and updated independently. Of especial value is the ability to later improve or modify one section of code without having to know the details of other sections, and without having to make corresponding changes to those sections.
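The layered design mentioned above can be illustrated with a minimal Python sketch. The layer boundaries and function names are illustrative assumptions; the point is that each layer can change (a new data source, a new pricing rule, a new output format) without touching the others:

```python
# Data access layer: the only code that knows where the data lives.
# Here it returns a hard-coded list; swapping in a database query would
# not affect the layers below.
def load_orders():
    return [{"item": "widget", "qty": 2, "unit_price": 9.50},
            {"item": "gadget", "qty": 1, "unit_price": 24.00}]

# Business logic layer: pricing rules, with no knowledge of storage or display.
def order_total(orders):
    return sum(o["qty"] * o["unit_price"] for o in orders)

# Presentation layer: formatting only.
def render_total(total):
    return f"Total: ${total:.2f}"

print(render_total(order_total(load_orders())))  # Total: $43.00
```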




Skeuomorph is a physical ornament or design on an object made to resemble another material or technique. Examples include pottery embellished with imitation rivets reminiscent of similar pots made of metal, or a software calendar application which displays the days organized on animated month pages in imitation of a paper desk calendar.

Skeuomorph is pronounced /ˈskjuːəmɔrf/ or [skyoo-uh-mawrf]. It is compounded from the Greek: skeuos, σκεῦος (container or tool), and morphê, μορφή (shape). The term has been applied to material objects since 1890, and is now used to describe computer interfaces.




Skeuomorphic Design is an interface design approach which graphically mimics the physical appearance of the object it attempts to emulate. For example, an address book application that actually looks like a book, or a streaming music service that looks like a jukebox. The advantage of such a design approach, as opposed to Flat Design, is its immediate familiarity and therefore comfort level to the user. However, as subsequent generations encounter such interfaces without a prior visual reference due to obsolescence, the benefit of the skeuomorph is questionable, although it may still have entertainment value as a compelling element in terms of the user experience.




State Diagram a type of diagram used in computer science and related fields to describe the behavior of systems. State diagrams require that the system described is composed of a finite number of states; sometimes, this is indeed the case, while at other times this is a reasonable abstraction. Many forms of state diagrams exist, which differ slightly and have different semantics.
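A state diagram's content—a finite set of states and the events that move between them—maps directly onto a transition table. The states and events below model a hypothetical document-review workflow, chosen only for illustration:

```python
# A finite-state machine as a transition table: (state, event) -> next state.
transitions = {
    ("draft", "submit"):      "in_review",
    ("in_review", "approve"): "published",
    ("in_review", "reject"):  "draft",
}

def step(state, event):
    # Raises KeyError for an event the diagram does not allow in this state.
    return transitions[(state, event)]

state = "draft"
for event in ["submit", "reject", "submit", "approve"]:
    state = step(state, event)
print(state)  # published
```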



Storyboards are graphic organizers in the form of illustrations or images displayed in sequence for the purpose of pre-visualizing a motion picture, animation, motion graphic or interactive media sequence.



Structured Query Language (SQL) is a special-purpose programming language designed for managing data in relational database management systems (RDBMS).
Originally based upon relational algebra and tuple relational calculus, its scope includes data insert, query, update and delete, schema creation and modification, and data access control.

SQL was one of the first commercial languages for Edgar F. Codd's relational model, as described in his influential 1970 paper, "A Relational Model of Data for Large Shared Data Banks".[4] Despite not adhering to the relational model as described by Codd, it became the most widely used database language.[5][6] Although SQL is often described as, and to a great extent is, a declarative language (4GL), it also includes procedural elements. SQL became a standard of the American National Standards Institute (ANSI) in 1986, and of the International Organization for Standardization (ISO) in 1987. Since then, the standard has been enhanced several times with added features. But code is not completely portable among different database systems, which can lead to vendor lock-in: the different makers do not perfectly follow the standard, they add extensions, and the standard is sometimes ambiguous.
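The core operations named above—insert, query, update, and delete—can be demonstrated with Python's built-in `sqlite3` module (the `books` table and its rows are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Schema creation
conn.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, year INTEGER)")

# Data insert
conn.execute("INSERT INTO books (title, year) VALUES (?, ?)", ("Example Title", 1970))

# Query
row = conn.execute("SELECT title, year FROM books WHERE year = 1970").fetchone()
print(row)  # ('Example Title', 1970)

# Update
conn.execute("UPDATE books SET year = 1971 WHERE title = ?", ("Example Title",))

# Delete
conn.execute("DELETE FROM books WHERE year = 1971")
print(conn.execute("SELECT COUNT(*) FROM books").fetchone()[0])  # 0
```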



Task Analysis is the analysis of how a task is accomplished, including a detailed description of both manual and mental activities, task and element durations, task frequency, task allocation, task complexity, environmental conditions, necessary clothing and equipment, and any other unique factors involved in or required for one or more people to perform a given task.[1] Task analysis emerged from research in applied behavior analysis and still has considerable research in that area.

Information from a task analysis can then be used for many purposes, such as personnel selection and training, tool or equipment design,[2] procedure design (e.g., design of checklists or decision support systems) and automation.



Taxonomy Almost anything—animate objects, inanimate objects, places, concepts, events, properties, and relationships—may be classified according to some taxonomic scheme. Taxonomies of the more generic kinds of things typically stem from philosophical investigations. Starting with Aristotle's Categories, several philosophers, especially ontologists, have arranged generic categories (also called types or classes) in a hierarchy that more or less satisfies the criteria for being a true taxonomy.

Taxonomy, or categorization, in human cognition has been a major area of research in psychology. Social psychologists have sought to model the manner in which the human mind categorizes social stimuli (Self-categorization theory is a prototypical example).[16][17] Some have argued that the adult human mind naturally organizes its knowledge of the world into such systems. Anthropologists have observed that taxonomies are generally embedded in local cultural and social systems, and serve various social functions.

Other taxonomies, such as those analyzed by Durkheim and Lévi-Strauss, are sometimes called folk taxonomies to distinguish them from scientific taxonomies. Baraminology is a taxonomy used in creation science which, in classifying form taxa, resembles folk taxonomies. The phrase "enterprise taxonomy" is used in business (see economic taxonomy) to describe a very limited form of taxonomy used only within one organization; for example, a method of classifying boxes as "Type A", "Type B" and "Type C" used within a box company for categorizing box shipments. The military and health care/safety science fields also have their own taxonomies. In the field of modern computing, the semantic web requires formal XML extension taxonomies (like XBRL), often containing a collection of elements and attributes qualified by namespaces to help distinguish identically named elements.



UML Unified Modeling Language.



Usability inspection is the name for a set of methods where an evaluator inspects a user interface. This is in contrast to usability testing where the usability of the interface is evaluated by testing it on real users. Usability inspections can generally be used early in the development process by evaluating prototypes or specifications for the system that can't be tested on users. Usability inspection methods are generally considered to be cheaper to implement than testing on users.



Usability Testing is a technique used in user-centered interaction design to evaluate a product by testing it on users. This can be seen as an irreplaceable usability practice, since it gives direct input on how real users use the system. This is in contrast with usability inspection methods where experts use different methods to evaluate a user interface without involving users.

Usability testing focuses on measuring a human-made product's capacity to meet its intended purpose. Examples of products that commonly benefit from usability testing are foods, consumer products, web sites or web applications, computer interfaces, documents, and devices. Usability testing measures the usability, or ease of use, of a specific object or set of objects, whereas general human-computer interaction studies attempt to formulate universal principles.



Use Case A user-centered design method in which critical tasks are systematically documented with their prerequisites, the user's steps and system steps, and the task outcome. Use cases are typically described in the abstract, which makes them particularly helpful in object-oriented design. Scenarios are concrete instantiations of use cases.

A use case can be composed of a list of steps, typically defining interactions between a role (known in UML as an "actor") and a system, to achieve a goal. The actor can be a human or an external system.
In systems engineering, use cases are used at a higher level than within software engineering, often representing missions or stakeholder goals. The detailed requirements may then be captured in SysML or as contractual statements.



User-Centered Design (UCD) is a type of user interface design and a process in which the needs, wants, and limitations of end users of a product are given extensive attention at each stage of the design process. User-centered design can be characterized as a multi-stage problem solving process that not only requires designers to analyse and foresee how users are likely to use a product, but also to test the validity of their assumptions with regards to user behaviour in real world tests with actual users. Such testing is necessary as it is often very difficult for the designers of a product to understand intuitively what a first-time user of their design experiences, and what each user's learning curve may look like.
The chief difference from other product design philosophies is that user-centered design tries to optimize the product around how users can, want, or need to use the product, rather than forcing the users to change their behavior to accommodate the product.



User Experience (UX) encompasses all aspects of the end-user's interaction with the company, its services, and its products. The first requirement for an exemplary user experience is to meet the exact needs of the customer, without fuss or bother. Next comes simplicity and elegance that produce products that are a joy to own, a joy to use. True user experience goes far beyond giving customers what they say they want, or providing checklist features. In order to achieve high-quality user experience in a company's offerings there must be a seamless merging of the services of multiple disciplines, including engineering, marketing, graphical and industrial design, and interface design.



User Experience Design (UXD or UED) is a broad term used to explain all aspects of a person’s experience with the system, including the interface, graphics, industrial design, physical interaction, and the manual. [1] It also refers to the application of user-centered design practices to generate cohesive, predictive and desirable designs based on holistic consideration of users’ experience. In most cases, User Experience Design fully encompasses traditional Human-Computer Interaction (HCI) design, and extends it by addressing all aspects of a product or service as perceived by users.

A holistic, multidisciplinary approach to the design of user interfaces for digital products, defining their form, behavior, and content. User experience design integrates interaction design, industrial design, information architecture, information design, visual interface design, user assistance design, and user-centered design, ensuring coherence and consistency across all of these design dimensions.—Pabini Gabriel-Petit



User Interface (UI) the means by which a user interacts with a system: the point of contact between human and machine, encompassing the controls the user manipulates and the information the system presents.



User Interface Design (UI) or user interface engineering is the design of computers, appliances, machines, mobile communication devices, software applications, and websites with the focus on the user's experience and interaction. The goal of user interface design is to make the user's interaction as simple and efficient as possible, in terms of accomplishing user goals—what is often called user-centered design. Good user interface design facilitates finishing the task at hand without drawing unnecessary attention to itself. Graphic design may be utilized to support its usability. The design process must balance technical functionality and visual elements (e.g., mental model) to create a system that is not only operational but also usable and adaptable to changing user needs.

Interface design is involved in a wide range of projects from computer systems, to cars, to commercial planes; all of these projects involve much of the same basic human interactions yet also require some unique skills and knowledge. As a result, designers tend to specialize in certain types of projects and have skills centered around their expertise, whether that be software design, user research, web design, or industrial design.



User Journey the user's holistic experience in using more than one application or web experience to complete a task.




User Story a stakeholder's or other non-UX team member's aspirational description of how they view the user's experience as a whole.



Vertical Prototypes do not attempt to show all that will be in a system but instead focus on implementing a small set of features in a nearly-complete fashion; they are most appropriate when a certain complex feature of a system is poorly-understood and needs to be explored, e.g. as a proof-of-concept.



WCAG Web Content Accessibility Guidelines, published by the W3C's Web Accessibility Initiative.



Web Analytics is the measurement, collection, analysis and reporting of internet data for purposes of understanding and optimizing web usage.

Web analytics is not just a tool for measuring web traffic but can be used as a tool for business and market research, and to assess and improve the effectiveness of a web site. Web analytics applications can also help companies measure the results of traditional print or broadcast advertising campaigns. It helps one to estimate how traffic to a website changes after the launch of a new advertising campaign. Web analytics provides information about the number of visitors to a website and the number of page views. It helps gauge traffic and popularity trends which is useful for market research.
There are two categories of web analytics; off-site and on-site web analytics.

Off-site web analytics refers to web measurement and analysis regardless of whether you own or maintain a website. It includes the measurement of a website's potential audience (opportunity), share of voice (visibility), and buzz (comments) that is happening on the Internet as a whole.

On-site web analytics measure a visitor's behavior once on your website. This includes its drivers and conversions; for example, the degree to which different landing pages are associated with online purchases. On-site web analytics measures the performance of your website in a commercial context. This data is typically compared against key performance indicators for performance, and used to improve a web site or marketing campaign's audience response. Google Analytics is the most widely-used on-site web analytics service; although new tools are emerging that provide additional layers of information, including heat maps and session replay.

Historically, web analytics has referred to on-site visitor measurement. However in recent years this has blurred, mainly because vendors are producing tools that span both categories.
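The basic on-site metrics described above—page views, unique visitors, and conversions by landing page—amount to simple aggregation over a visit log. The log below is entirely hypothetical, sketched only to show the shape of the computation:

```python
from collections import Counter

# Hypothetical visit log: (visitor_id, landing_page, converted_to_purchase)
visits = [
    ("v1", "/home",  False),
    ("v2", "/promo", True),
    ("v1", "/promo", True),
    ("v3", "/home",  False),
]

page_views = len(visits)                          # every visit counts
unique_visitors = len({v for v, _, _ in visits})  # each visitor counted once
conversions = Counter(page for _, page, ok in visits if ok)

print(page_views)             # 4
print(unique_visitors)        # 3
print(conversions["/promo"])  # 2
```

Real analytics services add attribution rules, sessionization, and bot filtering on top, but the underlying question—which landing pages are associated with conversions—reduces to counts like these.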



Wireframe, Website Wireframe also known as a page schematic or screen blueprint, is a visual guide that represents the skeletal framework of a website. Wireframes are created by user experience professionals called "interaction designers." The interaction designers, who have broad backgrounds in visual design, information architecture and user research, create wireframes for the purpose of arranging elements to best accomplish a particular purpose, usually informed by a business objective and a creative idea. The wireframe depicts the page layout or arrangement of the website's content, including interface elements and navigational systems, and how they work together. The wireframe usually lacks typographic style, color, or graphics, since the main focus lies in functionality, behavior, and priority of content. In other words, it focuses on what a screen does, not what it looks like. Wireframes can be pencil drawings or sketches on a whiteboard, or they can be produced by means of a broad array of free or commercial software applications.
Wireframes focus on

  • The kinds of information displayed
  • The range of functions available
  • The relative priorities of the information and functions
  • The rules for displaying certain kinds of information
  • The effect of different scenarios on the display[5]

The website wireframe connects the underlying conceptual structure, or information architecture, to the surface, or visual design of the website.[2] Wireframes help establish functionality, and the relationships between different screen templates of a website. An iterative process, creating wireframes is an effective way to make rapid prototypes of pages, while measuring the practicality of a design concept. Wireframing typically begins between “high-level structural work—like flowcharts or site maps—and screen designs.” Within the process of building a website, wireframing is where thinking becomes tangible.

Aside from websites, wireframes are utilized for the prototyping of mobile sites, computer applications, or other screen-based products that involve human-computer interaction. Future technologies and media will force wireframes to adapt and evolve.