Business Intelligence and Tools Glossary (Sikkim Manipal University)


A

Append: The append process unconditionally adds the incoming data, preserving the existing data in the target table. When an incoming record is a duplicate of an existing record, you can define the process either to allow or to reject the incoming record.
Attributes: Attributes describe the characteristics or properties of the entities.

B

Bottom-Up Approach: The objective of the bottom-up approach is to deliver business value by deploying dimensional data marts as early as possible. Unlike the top-down approach, the data marts in this approach contain all the data (both atomic and summary) that users may want. The data is modeled in a star schema design to optimize usability and query performance.
Business Analyzing phase: This is one of the phases in phased enterprise data modeling. It provides a means for further defining the concepts introduced in the information planning phase. This phase is described in business terms so that business people can understand the data details without any special training. Its purpose is to gather and arrange the business requirements and define the business terms.
Business Area Analysis (BAA): This is one of the tiers in phased enterprise data modeling as proposed by IBM in the Worldwide Solution Design and Delivery Method.
Business Dimensional Lifecycle: This is a methodology adopted for planning, designing, implementing, and maintaining a BI system.
Business Intelligence (BI): A generic term used to describe leveraging an organization's internal and external data and information to make the best possible business decisions.
Business Objects: This is a popular suite comprising a set of business intelligence tools.
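The 'Append' behavior defined above can be sketched in a few lines of Python. This is a minimal illustration, not an actual ETL tool API; the record structure and the `id` key field are hypothetical.

```python
def append(target, incoming, key, allow_duplicates=False):
    """Unconditionally add incoming rows while preserving existing target rows.

    When an incoming row duplicates an existing key, the process is defined
    either to allow it (append anyway) or to reject it.
    """
    existing_keys = {row[key] for row in target}
    result = list(target)  # existing data in the target is preserved
    rejected = []
    for row in incoming:
        if row[key] in existing_keys and not allow_duplicates:
            rejected.append(row)  # duplicate of an existing record: rejected
        else:
            result.append(row)
    return result, rejected

# Hypothetical sample data
target = [{"id": 1, "qty": 5}]
incoming = [{"id": 1, "qty": 7}, {"id": 2, "qty": 3}]
merged, rejected = append(target, incoming, key="id")
```

Passing `allow_duplicates=True` instead would admit both incoming rows, which is the other behavior the definition permits.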
Business System Design (BSD): This is one of the tiers in phased enterprise data modeling as proposed by IBM in the Worldwide Solution Design and Delivery Method.
Business System Implementation (BSI): This is one of the tiers in phased enterprise data modeling as proposed by IBM in the Worldwide Solution Design and Delivery Method.
Business System Maintenance (BSM): This is one of the tiers in phased enterprise data modeling as proposed by IBM in the Worldwide Solution Design and Delivery Method.
By-Product Method: This is a method employed to determine the information needs of the senior executives in an organization. Under this method, various informational by-products of the current operations of the organization are summarized and aggregated using the traditional TPSs and other MISs already in use in the organization.

C

Capital costs: These are the costs associated with the acquisition of a data warehouse.
Capture based on Date and Time Stamp: In this method, every source record that is created or updated is marked with a stamp showing the date and time. The data capture occurs some time after the creation or update of a source record, and the time stamp provides the basis for selecting records for data extraction.
Capture by Comparing Files: In this method, you capture changes by comparing the current source data with the previously captured data.
Capture in Source Applications: In this method, the source application is made to assist in the data capture for the data warehouse. Here, you need to modify the relevant application programs that write to the source files and databases.
Capture through Database Triggers: Database triggers are special stored procedures or programs stored in the database that are fired when a pre-defined event occurs.
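The timestamp-based capture technique above can be sketched as a simple filter: keep only the rows whose stamp is later than the previous extraction time. This is a minimal sketch; the `updated_at` field name and the record layout are hypothetical.

```python
from datetime import datetime

def capture_since(source_rows, last_extract_time):
    """Select rows created or updated after the previous extraction,
    using each row's date/time stamp as the basis for selection."""
    return [r for r in source_rows if r["updated_at"] > last_extract_time]

# Hypothetical source records with date/time stamps
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 3, 1)},
]
# Extraction runs later; only records stamped after the cutoff are captured
changed = capture_since(rows, datetime(2024, 2, 1))
```

In practice the cutoff (`last_extract_time`) would itself be persisted between extraction runs, so each run picks up where the previous one stopped.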
Capture through Transaction Logs: In this method, the transaction logs of the DBMSs are used to capture the data. Whenever there is an update to a database table, the DBMS writes entries to the log file.
Cardinality: Cardinality represents the maximum number of instances of one entity that are related to a single instance of another entity, and vice versa. The possible cardinalities include one-to-one (1:1), one-to-many (1:M), and many-to-many (M:M). In a detailed normalized ER model, an M:M relationship is not shown because it is resolved into an associative entity.
Classification: In this approach, the data mining processes are intended to discover rules that define whether an event belongs to a particular subset or class of data. This category of techniques is applicable to many types of business problems, and the technique involves two sub-processes: building a model and predicting classifications.
Cluster Analysis: Clustering methods can be used to create partitions so that all members of each set are similar according to a set of metrics. A cluster is simply a set of objects grouped together by virtue of their similarity to each other.
Cognos: Cognos is a rich set of tools for the development of data mines, data marts, and data warehouses.
Constructive Merge: If the primary key of an incoming record matches the key of an existing record, this process leaves the existing record, adds the incoming record, and marks it as superseding the old record.
Consultant: An individual who has experience and expertise in applying tools and techniques to resolve process problems and who can advise and facilitate an organization's improvement efforts.
Conversion: This is a data transformation task that includes a large variety of rudimentary conversions of single fields.
This task is done for two reasons: to standardize the data among the data extractions from disparate source systems, and to make the fields usable and understandable to the users.
Critical Success Factors (CSF) Method: This is a method employed to determine the information needs of the senior executives in an organization. Under this approach, the critical success factors of the organization are identified. The CSFs are those things that must be done right if the organization is to be successful. As with the key indicator method, this method requires gathering information on the identified CSFs, which is then supplied to the top executives.
Current Value: The current value is the stored value of an attribute at a given moment in time. These values are transient and change as business transactions happen.

D

Dashboard: This is a reporting tool that consolidates, aggregates, and arranges measurements and metrics (measurements compared to a goal) on a single screen so that information can be monitored at a glance.
Data integration: This includes combining all relevant operational data into coherent data structures to make them ready for loading into the data warehouse. It standardizes the names and data representations and resolves discrepancies.
Data Loading: After the creation of load images, the next set of activities is to take the prepared data, apply it to the data warehouse, and store it in the data warehouse database. The data warehouse is offline during the loads.
Data Management: This is the process of controlling, protecting, and facilitating access to data in order to provide the end users with timely access to the data they need.
Data mart: A physical and logical subset of an enterprise data warehouse, also termed a department-specific data warehouse. Generally, data marts are organized around a single business process.
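The field-level 'Conversion' task described above can be sketched as a small standardization function. This is an illustrative sketch only: the package codes, the mapping table, and the idea that one source system stores codes while another stores names are all hypothetical.

```python
# Hypothetical mapping from one source system's numeric codes to names
CODE_TO_NAME = {"01": "BOX", "02": "CARTON"}

def convert_field(package_type):
    """Standardize a single field drawn from disparate source systems:
    numeric codes are mapped to names, and free-text names are normalized
    to one uppercase representation usable by warehouse users."""
    value = str(package_type).strip()
    return CODE_TO_NAME.get(value, value.upper())

# One source sends a code, another sends a name; both converge
standardized = [convert_field("01"), convert_field("carton")]
```

Both reasons the glossary gives are visible here: the two source representations are standardized to one, and the cryptic code "01" becomes a value ("BOX") that is understandable to users.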
Data mining (DM) (also called data surfing): This is the set of activities used to find new, hidden, or unexpected patterns in data. It is the process of analyzing data from different perspectives and summarizing it into useful information. The technique is geared toward the user who typically does not know exactly what he is searching for but is looking for particular patterns or trends: data mining sifts through large amounts of data to produce data content relationships. It can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. The most valuable results from data mining include clustering, classifying, and estimating the things that occur together. Many kinds of tools play a role in data mining, including neural networks, decision trees, visualization, genetic algorithms, and fuzzy logic.
Data model: This is a well-organized abstraction of the data.
Data Modeler: An individual in a BI project who is responsible for taking the data structure that exists in the enterprise and modeling it into a schema suitable for OLAP analysis.
Data Modeling: A method used to define and analyze the data requirements needed to support the business functions of an organization.
Data of revisions (also known as incremental data capture): This includes the revisions since the last time data was captured. If the source data is transient, the capture of revisions is a difficult exercise.
Data Partitioning: The term 'partition' refers to the physical status of a data structure that has been divided into two or more separate structures. Logical partitioning of the data is also required to better understand and use the data; in such a case, the logical partitioning overlaps with the physical partitioning.
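The 'Cluster Analysis' idea defined earlier — grouping objects by similarity to each other — can be shown with a toy one-dimensional example. This is a deliberately minimal sketch (nearest fixed centroid, one pass), not a real clustering algorithm such as k-means; the values and centroids are made up.

```python
def cluster_1d(values, centroids):
    """Group each value with its nearest centroid, so that all members of
    each resulting set are similar (close) according to one metric:
    absolute distance."""
    clusters = {c: [] for c in centroids}
    for v in values:
        nearest = min(centroids, key=lambda c: abs(c - v))
        clusters[nearest].append(v)
    return clusters

# Hypothetical values: two natural groups, one near 2 and one near 10
groups = cluster_1d([1, 2, 9, 10, 11], centroids=[2, 10])
```

A real clustering method would also learn the centroids from the data by iterating; here they are fixed purely to illustrate the partitioning step.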
Data Profiling: Data profiling is a critical step in data migration that automates the identification of problematic data and metadata and enables organizations to correct inconsistencies, redundancies, and inaccuracies in their databases.
Data Visualization: Data visualization is the process by which numerical data are converted into meaningful images; the data may come from any type of source, e.g. satellite photos, undersea sonic measurements, surveys, or computer simulations. It involves examining data represented by dynamic images rather than pure numbers. These techniques turn data into information by using the high capacity of the human brain to visually recognize patterns and trends.
Data warehouse: A subject-oriented, integrated, non-volatile, time-variant collection of data designed to support the decision-making requirements of an organization.
Data: A set of collected facts.
Database Administrator (DBA): An individual in a BI project who keeps the database available so that the applications run smoothly, and who is also involved in planning and executing a backup/recovery plan, as well as performance tuning.
DataStage: DataStage provides a set of powerful tools for developing a data warehouse.
Decentralized Warehouse: A remote data source that users can query or access via a central gateway that provides a logical view of corporate data in terms that users can understand. The gateway parses and distributes queries in real time to remote data sources and returns result sets back to users.
Decision trees: This technique offers a conceptually simple mathematical method of following the effect of each event, or decision, on successive events.
Deduplication: Some companies may maintain several records for a single customer, and duplicates are the result of these additional records.
It is therefore suggested to keep a single record for each customer and link all the duplicates in the source systems to this single record in your data warehouse. This process is called deduplication.
Deferred Data Extraction: All the methods of immediate data extraction involve real-time data capture. In contrast, deferred data extraction methods do not capture the changes in real time but do so at a later time.
Derived Data: This is data that has been derived or created, perhaps by aggregating or averaging the real-time data through a defined process. This data can represent a view of the business at a specific point in time or can be a historical record of the business over a period of time. (See also: Reconciled data.)
Destructive Merge: When you apply the incoming data to the target data, the destructive merge process updates the target record if the primary key of an incoming record matches the key of an existing record. If the incoming record is a new record, it simply gets added to the target table.
Detailed raw data: This is the lowest level of detailed transaction data available within a data warehouse or a data mart, without any aggregation or summarization.
Dimension: A dimension is a collection of members or units of the same type of views, usually represented by an axis. In a dimensional model, every data point in the fact table is associated with one and only one member from each of the multiple dimensions.
Dimensional modeling: This is one of the data modeling techniques. It uses three basic concepts: facts, dimensions, and measures. Dimensional modeling is powerful in representing the requirements of the business user in the context of database tables, and also in the area of data warehousing.
DOLAP: This stands for desktop online analytical processing. DOLAP is a variation of ROLAP.
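The contrast between the 'Destructive Merge' and 'Constructive Merge' entries above can be made concrete with a short sketch. This is an illustration under hypothetical record layouts: the `current` flag is one common way to mark a superseded record, not something the glossary prescribes.

```python
def destructive_merge(target, incoming, key):
    """On a key match, overwrite (update) the target record; a record with
    a new key is simply added to the target."""
    by_key = {row[key]: row for row in target}
    for row in incoming:
        by_key[row[key]] = row  # overwrite on match, insert if new
    return list(by_key.values())

def constructive_merge(target, incoming, key):
    """On a key match, leave the existing record in place, add the incoming
    record, and mark the new record as superseding the old one."""
    incoming_keys = {row[key] for row in incoming}
    result = []
    for row in target:
        row = dict(row)
        row["current"] = row[key] not in incoming_keys  # superseded if matched
        result.append(row)
    for row in incoming:
        result.append(dict(row, current=True))  # supersedes any old record
    return result

# Hypothetical data: key 1 already exists, key 2 is new
t = [{"id": 1, "v": "a"}]
i = [{"id": 1, "v": "b"}, {"id": 2, "v": "c"}]
overwritten = destructive_merge(t, i, "id")
history = constructive_merge(t, i, "id")
```

The destructive result keeps one row per key with no trace of the old value; the constructive result keeps both rows for key 1, with the flag recording which one is current.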
Domain: A domain consists of all the possible acceptable values and categories that are allowed for an attribute. It is the set of all real possible occurrences.
Drill-down: This is the capability to browse the information through a hierarchical structure.

E

Enrichment: This is a data transformation task that involves the rearrangement and simplification of individual fields to make them useful for the data warehouse environment.
Enterprise Data Model: This is an approach to developing a data warehouse data model. An EDM is a consistent definition of all the data elements common to the business, from a high-level business view down to a generic logical data design. Using this model, you can derive the general scope and understanding of the business requirements; the model also includes links to the physical data designs of the individual applications.
Enterprise data warehouse: This consists of the data drawn from multiple operational systems of an organization. It supports time-series and trend analysis across different business areas of an organization, and so can be used for strategic decision-making.
Enterprise Information System (EIS): This is an information interface system that is specially designed to facilitate the analysis of critical information for operating an organization. These systems provide tools that support the strategic decision-making needs of the top executives of the organization.
Entity: An entity is defined as a person, place, thing, or event of interest to the business or the organization. It represents a class of objects: things in the real business world that can be observed and classified by their properties and characteristics. In general, an entity has its own business definition and a clear boundary definition that describes what is included and what is not.
ER modeling: This is one of the data modeling techniques.
It produces a data model of the specific area of interest, using two basic concepts: entities and the relationships between them. A detailed ER model may also contain attributes, which can be properties of either the entities or the relationships. The ER model is an abstraction tool, as it can be used to simplify, understand, and analyze the ambiguous data relationships in the real business world.
ETL Developer: An individual in a BI project who is involved in planning, developing, and deploying the extraction, transformation, and loading routines for the data warehouse from the legacy systems.
External Data Source: This is data that is not available in the OLTP systems but is required to enhance the information quality in the data warehouse. Examples include data about competitors, information from regulatory and government bodies, and research data from professional bodies and universities.

F

Fact: A fact is a collection of related data items, consisting of measures and context data. A fact represents a business item, a business transaction, or an event that can be used to analyze the business or a business process.
Format Revisions: Format revisions include changes to the data types and lengths of individual fields. For instance, product package types in your source systems may be indicated by codes and names in which the fields are of numeric and text data types.
Front End Developer: An individual in a BI project who develops the front end, whether it is client-server or web-based.
Full refresh: This involves completely erasing the contents of one or more tables and reloading them with fresh data (the initial load is a refresh of all the tables).

G

Granularity: This refers to the level of detail of the data provided in a data warehouse or a data mart.
A typical data warehouse will have some tables with a lot of detail and other tables that are summarized or aggregated, which means less detail. The more detailed the available data, the lower the level of granularity.

I

Immediate Data Extraction: Immediate data extraction is real-time data extraction.
Improvement: The positive effect of a process change effort.
Incremental Load: This involves applying ongoing changes as necessary in a periodic manner.
Incremental improvement: Improvements that are implemented on a continual basis.
Informatica: This is a popular ETL tool in the market; the suite consists of five components and provides complete business intelligence solutions.
Information planning phase: This is one of the phases in phased enterprise data modeling. This phase provides a highly consolidated view of the business wherein you can view the business concepts. These business concepts can be categorized into business entities, super entities, or subject areas, and each of these items maintains related data elements.
Information System Planning (ISP): This is one of the tiers in phased enterprise data modeling as proposed by IBM in the Worldwide Solution Design and Delivery Method.
Initial Load: This involves populating all the data warehouse tables for the first time.
Integrated Data Warehouse: This is a development phase of a data warehouse. Data warehouses at this stage are used to generate activity or transactions that are passed back into the operational systems for use in the daily activity of the organization.
Intrinsic data quality: This represents the accuracy of the data. It is the degree to which data accurately reflects the real-world object that the data represents.
K

Key Indicator Method: This is a method employed to determine the information needs of the senior executives in an organization. In this method, the top executives monitor only the information that indicates an out-of-normal condition. Whenever such a condition occurs, the top executives may gather further information to make decisions intended to correct the condition.
Knowledge: Knowledge is part of the hierarchy made up of data, information, and knowledge. Data are raw facts. Information is data with context and perspective. Knowledge is information with guidance for action.

L

Linkage Analysis: The data mining techniques that employ linkage analysis (associations) search all details or transactions from operational systems for patterns with a high probability of repetition.
Logical Data Modeling phase: This is one of the phases in phased enterprise data modeling. This phase follows the business analyzing phase; it consists of several hundred entities and contains the identification and definition of all entities, relationships, and attributes. The entities of the logical data model can be further partitioned into views by subject areas or by applications. This phase can be divided into two types: a 'generic logical data model' for the organizational level and a 'logical application model' for the application-level view of the data.

M

Machine learning: These techniques, such as genetic algorithms and fuzzy logic, can derive meaning from complicated and imprecise data, and can extract patterns from and detect trends within the data that are far too complex to be noticed by either the human brain or more conventional automated analysis techniques.
Manager: An individual charged with the responsibility for managing resources and processes.
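The three load types defined earlier in this glossary (initial load, incremental load, and full refresh) differ only in when they run and what they touch. The sketch below illustrates the distinction with a plain dictionary standing in for a warehouse table; the product keys and values are hypothetical.

```python
def initial_load(warehouse, source):
    """Populate all the warehouse tables for the first time."""
    warehouse.clear()
    warehouse.update(source)

def incremental_load(warehouse, changes):
    """Apply only the ongoing changes, periodically."""
    warehouse.update(changes)

def full_refresh(warehouse, source):
    """Completely erase the contents and reload with fresh data."""
    warehouse.clear()
    warehouse.update(source)

wh = {}
initial_load(wh, {"p1": 10, "p2": 20})       # first-time population
incremental_load(wh, {"p2": 25, "p3": 30})   # later run: only the deltas
```

Note that initial load and full refresh are mechanically identical here; the glossary's point is that a refresh is *repeated* wholesale replacement, which is why running it daily can keep the warehouse offline for unacceptably long periods, whereas an incremental load touches only changed rows.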
Measure: A measure is a numeric attribute of a fact, representing the performance or behavior of the business relative to the dimensions.
Metadata: Metadata is "data about data": the information that describes or supplements the main data in a data warehouse or a data mart. Examples of metadata include data element descriptions, data type descriptions, attribute descriptions, and process descriptions.
Metric: A standard for measurement.
Mission: An organization's purpose.
MOLAP Model: This is the more traditional way of OLAP analysis. In MOLAP, data is stored in a multi-dimensional cube. The storage is not in the relational database, but in proprietary formats.

N

Neural networks: This technique attempts to mirror the way the human brain works in recognizing patterns, by developing mathematical structures with the ability to learn. By studying combinations of variables and how different combinations affect datasets, these networks develop nonlinear predictive models.
Normalization: This is a process of assigning the attributes to entities in a way that reduces data redundancy, avoids data anomalies, provides a solid architecture for updating data, and reinforces the long-term integrity of the data model.
Null Method: This is a method employed to determine the information needs of the senior executives in an organization. This method assumes that the information needs of senior executives are so dynamic and fluid that the pre-defined reports generated by the typical information systems are not especially useful.

O

Offline Data Warehouse: This is a development phase of a data warehouse. Data warehouses in this stage are updated on a regular basis (usually daily, weekly, or monthly) from the operational systems, and the data is stored in an integrated, reporting-oriented data structure.
Offline Operational Databases: This is a development phase of a data warehouse. During this stage, data warehouses are developed by simply copying the database of an operational system to an offline server, where the processing load of reporting does not impact the operational system's performance.
OLAP Developer: An individual in a BI project who develops the OLAP cubes.
On-Line Analytical Processing (OLAP): This is a category of software technology that enables users to gain insight into data through fast, consistent, interactive access to a wide variety of possible views of information that has been transformed from raw data to reflect the real dimensionality of the organization. It is implemented in a multi-user client/server mode and offers consistently rapid response to queries, regardless of database size and complexity. This software is also called multidimensional analysis software.
On-Line Transaction Processing (OLTP): This is the way data is processed by an end user or a computer system. Here, the data is detail oriented and highly repetitive, with large amounts of updates and changes. The major task of these systems is to perform on-line transaction and query processing. These systems cover most of the day-to-day operations of the organization, such as purchasing, inventory, manufacturing, payroll, banking, accounting, and registration.
Operational costs: These are the costs associated with running and maintaining the data warehouse.
Operational Databases: These are detail-oriented databases defined to meet the needs of the complex processes of an organization. Here, the data is highly normalized to avoid data redundancy and double-maintenance. A large number of transactions take place every hour on these databases; they are always "up to date" and represent a snapshot of the current situation.
In contrast to these databases, there are informational databases that are stable over a period of time and represent a situation at a specific point in time in the past.

P

Periodic Status: This is the status wherein the status value is stored with reference to time. For example, the data about an insurance policy is stored as the status data of the policy at each point in time, so the history of the changes is preserved in the source systems themselves.
Physical Data Design phase: This is one of the phases in phased enterprise data modeling. This phase covers the design for the actual physical implementation and applies physical constraints, such as space, performance, and the physical distribution of the data.
Policy: An overarching plan (direction) for achieving an organization's goals.
Process: A set of interrelated work activities characterized by a set of specific inputs and value-added tasks that make up a procedure for a set of specific outputs.
Project Manager: An individual in a BI project who monitors progress on a continual basis and is responsible for the success of the project.

Q

QA Group: A group of individuals in a BI project who ensure the correctness of the data in the data warehouse.

R

Real Time Data Warehouse: This is a development phase of a data warehouse. Data warehouses at this stage are updated on a transaction or event basis: the data is updated every time an operational system performs a transaction (e.g. an order, a delivery, or a booking).
Realistic data quality: This is the degree of utility and value the data has to support the organizational processes that accomplish the organizational objectives.
Real-time Data: This data represents the current status of the business. Typically, real-time data is used by operational applications to run the business, and the data constantly changes as operational transactions are processed.
Reconciled data: This is real-time data that has been cleansed, modified, or enhanced. This data provides an integrated source of quality data for use by data analysts.
Refresh: This is a much simpler option than update, but you may have to keep the data warehouse down for unacceptably long times if you run refresh jobs every day.
Relationship: Relationships represent the structural interaction and association among the entities in a model; they are represented by lines drawn between two specific entities. Generally, a relationship is named with a verb (such as owns, belongs, or has), and the relationship between the entities can be defined in terms of cardinality.
ROLAP Model: This methodology relies on manipulating the data stored in the relational database to give the appearance of traditional OLAP's slicing and dicing functionality. In this model, data is stored as rows and columns in relational form and is presented to the users in the form of business dimensions.

S

Selection and Splitting/Joining: This is the basic task done at the beginning of the entire data transformation process. Through this task, you may select either whole records or parts of several records from the source systems. The splitting/joining task includes the type of data manipulation you need to perform on the selected records: you can either split the selected parts further or join parts selected from many source systems. The joining task is quite often used in the data warehouse environment.
Sequential Discovery: Techniques that use sequencing or time-series analysis relate events in time based on a series of preceding events, e.g. prediction of interest rate fluctuations or stock performance. This analysis reveals various hidden trends and is often highly predictive of future events.
Snowflake Model: The snowflake model is derived from the star model and is the result of decomposing one or more of the dimensions, which sometimes have hierarchies themselves.
Source Identification: Source identification is a critical step in the data extraction process. For instance, suppose you intend to design a data warehouse to provide strategic information on the fulfillment of orders; you then need to store information on both fulfilled and pending orders. If you deliver the orders through multiple channels, you also need to capture data about the delivery channels.
Stakeholder: Any individual, group, or organization that will have a significant impact on, or will be significantly impacted by, the quality of the product or service an organization provides.
Star model: This is the basic structure for a dimensional model. It has one large central table (the fact table) and a set of smaller tables (the dimension tables) arranged in a radial pattern around the central table.
Static data: This is the capture of data at a specific point in time. For current data, this capture includes all transient data identified for extraction.
Statistical Analysis: Statistical analysis is the most mature of all data mining technologies and is the easiest to understand. Traditional statistical modeling techniques, such as regression analysis, are useful in building linear models that describe predictable data points.
Strategic planning: The process by which an organization envisions its future and develops strategies, goals, objectives, and action plans to achieve that future.
Summarization: This is a data transformation task used when you find that it is not required to keep data at the lowest level of detail in your data warehouse.
Summarized data: This is the transaction data in a data warehouse aggregated at the level required for the most-used queries.
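The 'Star model' entry above — one central fact table surrounded by small dimension tables — can be demonstrated end to end with an in-memory SQLite database. This is a toy schema with made-up table and column names, purely to show the fact/dimension join pattern.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- small, descriptive dimension table
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    -- large central fact table: each row points at one dimension member
    CREATE TABLE fact_sales (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# A typical dimensional query: aggregate facts, grouped by a dimension attribute
rows = con.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
```

In a snowflake variant, `dim_product` itself would be decomposed further (for example into product and category tables), adding one more join to the same query.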
Supplier: A source of materials, services, or information input provided to a process.
System: A group of interdependent processes and people that together perform a common mission.

T

Task: A specific, definable activity to perform an assigned piece of work, often finished within a certain time.
Team: A group of individuals organized to work together to accomplish a specific objective.
Technical Architect: An individual in a BI project who develops and implements the overall technical architecture of the BI system, from the back-end hardware and software to the client desktop configurations.
Top-Down Approach: The top-down approach views the data warehouse as the mainstay of the entire analytic environment. The data warehouse holds atomic or transaction-level data that has been extracted from the source systems and integrated within a normalized, enterprise data model. Later, the data is summarized, dimensionalized, and distributed to one or more "dependent" data marts.
Total Data Quality Management (TDQM): An approach to improving the quality of the data loaded into a warehouse.
Total study method: This is a method employed to determine the information needs of the senior executives in an organization. Under this approach, information is gathered from a sample of top executives in the organization concerning the totality of their information needs. This method is more comprehensive in nature, and expensive as well.
Trainer: An individual in a BI project who works with the end users to familiarize them with how the front end is set up, so that the end users can get the most benefit out of the system.
Type 1 Changes - Correction of Errors: Type 1 changes are applied to the data warehouse without any need to preserve history, as these changes usually relate to the correction of errors in the source systems.
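A Type 1 change as defined above is a simple overwrite: the erroneous value is corrected in place and no history survives. A minimal sketch, with a hypothetical dimension-row layout:

```python
def apply_type1(dimension, key, field, corrected_value):
    """Type 1 change: overwrite the erroneous value in the matching
    dimension row; no history of the old value is preserved."""
    for row in dimension:
        if row["key"] == key:
            row[field] = corrected_value
    return dimension

# Hypothetical customer dimension with a misspelled city to correct
dim = [{"key": 101, "city": "Bombay"}]
apply_type1(dim, 101, "city", "Mumbai")
```

After the change the old value is gone entirely, which is exactly why Type 1 is reserved for error correction rather than for genuine business changes.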
Type 2 Changes - History Preservation: Suppose there is a change in the marital status of a customer, and one of the essential requirements of your data warehouse is to track orders according to marital status. Type 2 changes of this kind are applied by preserving history, typically by creating a new dimension record, so that orders before and after the change are attributed to the correct value.

Type 3 Changes - Soft Revisions: Type 3 changes are tentative or soft revisions. Unlike Type 2 changes, the orders need to be maintained in both the old and the new groups after an effective date.

U

Unit: An object on which a measurement or observation can be made. Note: Commonly used in the sense of a "unit of product," the entity of product inspected in order to determine whether it is defective or non-defective.

Update: This is the application of incremental changes from the data sources, whereas 'refresh' is a complete reload of data at specified intervals. The refresh option involves the periodic replacement of complete data warehouse tables.

V

Visual Warehouse Administrative Clients: The Administrative Client also runs on a Windows NT server and provides an interface for administrative functions, such as defining the business views, defining the target data warehouse databases, registering data resources and filtering source data.

Visual Warehouse Agents: The architecture of Visual Warehouse agents is a key enabler for scalable business intelligence solutions. These agents run on Windows NT, OS/2, AS/400, AIX and Sun Solaris, and they handle access to the source data, filtering, transformation, sub-setting, and delivery of transformed data to the target warehouse as directed by the Visual Warehouse Server.

Visual Warehouse Control Database: A control database is set up in DB2 for use by Visual Warehouse to store control information used by the Server. The control database stores the metadata required to build and manage the warehouse.

Visual Warehouse Server: The Visual Warehouse server runs on a Windows NT workstation or server.
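The Type 2 entry above can be sketched as follows: close out the current dimension row and insert a new row with its own effective-date range, so old facts still join to the old attribute value. All keys, column names and dates here are illustrative:

```python
from datetime import date

# In-memory stand-in for a customer dimension with effective-date columns.
dim_customer = [
    {"surrogate_key": 1, "customer_id": "C101", "marital_status": "Single",
     "effective_from": date(2008, 1, 1), "effective_to": None},
]

def apply_type2(dim, customer_id, column, new_value, change_date):
    # Locate the currently-open row for this customer.
    current = next(r for r in dim
                   if r["customer_id"] == customer_id and r["effective_to"] is None)
    current["effective_to"] = change_date  # close out the old row
    new_row = dict(current,
                   surrogate_key=max(r["surrogate_key"] for r in dim) + 1,
                   effective_from=change_date,
                   effective_to=None)
    new_row[column] = new_value  # the new row carries the changed value
    dim.append(new_row)

apply_type2(dim_customer, "C101", "marital_status", "Married", date(2010, 4, 15))
print(len(dim_customer))  # 2 rows: history is preserved
```

A Type 3 change, by contrast, would keep a single row but add an "old value" column alongside the new value and an effective date.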
It controls the interaction of the various data warehouse components and provides for the automation of data warehousing processes through a powerful scheduling facility.

Visual Warehouse Target Databases: The target databases in a data warehouse contain the Visual Warehouse data stored in structures defined as Business Views (BVs). When Visual Warehouse populates a BV, the data is extracted from the source and transformed according to the rules defined in the BV, and is then stored in the target database.

Visual Warehouse: This is an integrated product for building and maintaining a data warehouse or data mart in a LAN environment. Visual Warehouse integrates many of the business intelligence component functions into a single product, and it can be used to automate the process of bringing data together from heterogeneous sources into a central, integrated, informational environment.

Bibliography

1. The Microsoft Data Warehouse Toolkit: With SQL Server 2005 and the Microsoft Business Intelligence Toolset, Joy Mundy and Warren Thornthwaite with Ralph Kimball, Wiley Publishing, Inc., ISBN-13: 978-0-471-26715-7, ISBN-10: 0-471-26715-5.
2. The Data Warehouse Toolkit: The Complete Guide to Dimensional Modeling, Ralph Kimball and Margy Ross, Wiley Publishing, Inc., ISBN: 81-265-0889-2.
3. Modern Data Warehousing, Mining, and Visualization: Core Concepts, George M. Marakas, Pearson Education, ISBN: 81-297-0210-X.
4. The Data Warehouse Lifecycle Toolkit: Expert Methods for Designing, Developing, and Deploying Data Warehouses, Ralph Kimball, Laura Reeves, Margy Ross, Warren Thornthwaite, Wiley Publishing, Inc., ISBN: 978-0-471-25547-5.
5.
The Data Warehouse ETL Toolkit: Practical Techniques for Extracting, Cleaning, Conforming, and Delivering Data, Ralph Kimball, Joe Caserta, Wiley Publishing, Inc., ISBN: 978-0-7645-6757-5.
6. The Data Webhouse Toolkit: Building the Web-Enabled Data Warehouse, Ralph Kimball, Richard Merz, Wiley Publishing, Inc., ISBN: 978-0-471-37680-4.
7. Data Warehousing Fundamentals: A Comprehensive Guide for IT Professionals, Paulraj Ponniah, Wiley Publishing, Inc., ISBN: 978-0-471-41254-0.
8. Building the Data Warehouse, William H. Inmon, Wiley Publishing, Inc., ISBN: 0-7645-9944-5.
9. Data Warehouse Management with DB2 UDB V8.1 Warehouse Manager, International Business Machines Corporation, Prentice-Hall of India Pvt. Ltd., ISBN: 81-203-2592-3.

This note was uploaded on 04/15/2010 for the course MBA mba taught by Professor Smu during the Spring '10 term at Manipal University.
