SAP BusinessObjects Data Services Migration Considerations Guide

Data Services Migration Considerations Guide
BusinessObjects Data Services XI 3.1 (12.1.1)
Copyright
© 2008 Business Objects, an SAP company. All rights reserved. Business Objects owns the following U.S. patents, which may cover products that are offered and licensed by Business Objects: 5,295,243; 5,339,390; 5,555,403; 5,590,250; 5,619,632; 5,632,009; 5,857,205; 5,880,742; 5,883,635; 6,085,202; 6,108,698; 6,247,008; 6,289,352; 6,300,957; 6,377,259; 6,490,593; 6,578,027; 6,581,068; 6,628,312; 6,654,761; 6,768,986; 6,772,409; 6,831,668; 6,882,998; 6,892,189; 6,901,555; 7,089,238; 7,107,266; 7,139,766; 7,178,099; 7,181,435; 7,181,440; 7,194,465; 7,222,130; 7,299,419; 7,320,122 and 7,356,779. Business Objects and its logos, BusinessObjects, Business Objects Crystal Vision, Business Process On Demand, BusinessQuery, Cartesis, Crystal Analysis, Crystal Applications, Crystal Decisions, Crystal Enterprise, Crystal Insider, Crystal Reports, Crystal Vision, Desktop Intelligence, Inxight and its logos, LinguistX, Star Tree, Table Lens, ThingFinder, Timewall, Let There Be Light, Metify, NSite, Rapid Marts, RapidMarts, the Spectrum Design, Web Intelligence, Workmail and Xcelsius are trademarks or registered trademarks in the United States and/or other countries of Business Objects and/or affiliated companies. SAP is the trademark or registered trademark of SAP AG in Germany and in several other countries. All other names mentioned herein may be trademarks of their respective owners.
Third-party Contributors
Business Objects products in this release may contain redistributions of software licensed from third-party contributors. Some of these individual components may also be available under alternative licenses. A partial listing of third-party contributors that have requested or permitted acknowledgments, as well as required notices, can be found at: http://www.businessobjects.com/thirdparty
2008-11-28

Contents

Chapter 1 Introduction........9
    Welcome to Data Services........10
        Welcome........10
        Documentation set for Data Services........10
        Accessing documentation........13
        Business Objects information resources........14

Chapter 2 Data Services Migration Considerations........17
    Behavior changes in version 12.1.1........18
        Blob data type enhancements........19
        Neoview bulk loading........20
    Behavior changes in version 12.1.0........20
        Cleansing package changes........21
        DTD-to-XSD conversion........21
        Minimum requirements for international addressing directories........22
        Try/catch exception groups........22
        Upgrading from version 12.0.0 to version 12.1.0........25
    Behavior changes in version 12.0.0........26
        Case transform enhancement........26
        Data Quality projects in Data Integrator jobs........27
        Data Services web address........27
        Large object data type enhancements........28
        License keycodes........31
        Locale selection........31
        ODBC bigint data type........33
        Persistent and pageable cache enhancements........33
        Row delimiter for flat files........33
    Behavior changes in version 11.7.3........34
        Data flow cache type........35
        Job Server enhancement........35
        Logs in the Designer........35
        Pageable cache for memory-intensive data flows........35
        Adapter SDK........36
        PeopleSoft 8........36
    Behavior changes in version 11.7.2........36
        Embedded data flows........37
        Oracle Repository upgrade........37
        Solaris and AIX platforms........39
    Behavior changes in version 11.7.0........39
        Data Quality........40
        Distributed data flows........42
        JMS Adapter interface........43
        XML Schema enhancement........43
        Password management........43
        Repository size........44
        Web applications........44
        Web services........44
    Behavior changes in version 11.6.0........45
        Netezza bulk loading........45
        Conversion between different data types........46
    Behavior changes in version 11.5.1.5........46
    Behavior changes in version 11.5.1........46
    Behavior changes in version 11.5.0.0........47
        Web Services Adapter........47
        Varchar behavior........47
        Central Repository........48
    Behavior changes in version 11.0.2.5........48
        Teradata named pipe support........48
    Behavior changes in version 11.0.2........48
    Behavior changes in version 11.0.1.1........49
        Statistics repository tables........49
    Behavior changes in version 11.0.1........49
        Crystal Enterprise adapters........50
    Behavior changes in version 11.0.0........50
        Changes to code page names........51
        Data Cleansing........52
        License files and remote access software........53
    Behavior changes in version 6.5.1........53
    Behavior changes in version 6.5.0.1........54
        Web services support........54
        Sybase bulk loader library on UNIX........55
    Behavior changes in version 6.5.0.0........55
        Browsers must support applets and have Java enabled........55
        Execution of to_date and to_char functions........56
        Changes to Designer licensing........57
        License files and remote access software........57
        Administrator Repository Login........58
        Administrator Users........59

Chapter 3 Data Quality to Data Services Migration........61
    Overview of migration........62
        Who should migrate?........62
        Why migrate?........62
        Introduction to the interface........64
        Downloading blueprints and other content objects........66
        Introduction to the migration utility........67
        Terminology in Data Quality and Data Services........67
        Naming conventions........69
        Deprecated objects........70
        Premigration checklist........72
    Using the migration tool........73
        Overview of the migration utility........73
        Migration checklist........74
        Connection information........75
        Running the dqmigration utility........76
        dqmigration utility syntax and options........79
        Migration report........83
    How Data Quality repository contents migrate........85
        How projects and folders migrate........85
        How connections migrate........91
        How substitution files and variables migrate........97
        How data types migrate........102
        How Data Quality attributes migrate........103
    How transforms migrate........103
        Overview of migrated transforms........103
        Address cleansing transforms........111
        Reader and Writer transforms........121
        How Data Quality integrated batch Readers and Writers migrate........159
        How Data Quality transactional Readers and Writers migrate........165
        Matching transforms........169
        UDT-based transforms........178
        Other transforms........188
        Suggestion Lists options........202
    Post-migration tasks........203
        Further cleanup........203
        Improving performance........210
        Troubleshooting........215

Index........223

Chapter 1
Introduction

Welcome to Data Services

This document contains the following migration topics:
Migration considerations for behavior changes associated with each
version of the Data Integrator and Data Services products.
Migration of your Data Quality Projects into Data Services.

Welcome

Data Services XI Release 3 provides data integration and data quality processes in one runtime environment, delivering enterprise performance and scalability.
The data integration processes of Data Services allow organizations to easily explore, extract, transform, and deliver any type of data anywhere across the enterprise.
The data quality processes of Data Services allow organizations to easily standardize, cleanse, and consolidate data anywhere, ensuring that end-users are always working with information that's readily available, accurate, and trusted.

Documentation set for Data Services

You should become familiar with all the pieces of documentation that relate to your Data Services product.
Document
    What this document provides

Documentation Map
    Information about available Data Services books, languages, and locations
Release Summary
    Highlights of key features in this Data Services release
Release Notes
    Important information you need before installing and deploying this version of Data Services
Getting Started Guide
    An introduction to Data Services
Installation Guide for Windows
    Information about and procedures for installing Data Services in a Windows environment
Installation Guide for UNIX
    Information about and procedures for installing Data Services in a UNIX environment
Advanced Development Guide
    Guidelines and options for migrating applications, including information on multi-user functionality and the use of the central repository for version control
Designer Guide
    Information about how to use Data Services Designer
Integrator's Guide
    Information for third-party developers to access Data Services functionality. Also provides information about how to install, configure, and use the Data Services Adapter for JMS.
Management Console: Administrator Guide
    Information about how to use Data Services Administrator
Management Console: Metadata Reports Guide
    Information about how to use Data Services Metadata Reports
Migration Considerations Guide
    Information about release-specific product behavior changes from earlier versions of Data Services to the latest release, and about how to migrate from Data Quality to Data Services
Performance Optimization Guide
    Information about how to improve the performance of Data Services
Reference Guide
    Detailed reference material for Data Services Designer
Technical Manuals
    A compiled “master” PDF of core Data Services books containing a searchable master table of contents and index: Getting Started Guide; Installation Guide for Windows; Installation Guide for UNIX; Designer Guide; Reference Guide; Management Console: Metadata Reports Guide; Management Console: Administrator Guide; Performance Optimization Guide; Advanced Development Guide; Supplement for J.D. Edwards; Supplement for Oracle Applications; Supplement for PeopleSoft; Supplement for Siebel; Supplement for SAP
Tutorial
    A step-by-step introduction to using Data Services
In addition, you may need to refer to several Adapter Guides and Supplemental Guides.
Document
    What this document provides

Salesforce.com Adapter Interface
    Information about how to install, configure, and use the Data Services Salesforce.com Adapter Interface
Supplement for J.D. Edwards
    Information about license-controlled interfaces between Data Services and J.D. Edwards World and J.D. Edwards OneWorld
Supplement for Oracle Applications
    Information about the license-controlled interface between Data Services and Oracle Applications
Supplement for PeopleSoft
    Information about license-controlled interfaces between Data Services and PeopleSoft
Supplement for SAP
    Information about license-controlled interfaces between Data Services, SAP ERP, and SAP BI/BW
Supplement for Siebel
    Information about the license-controlled interface between Data Services and Siebel

Accessing documentation

You can access the complete documentation set for Data Services in several places.
Accessing documentation on Windows
After you install Data Services, you can access the documentation from the Start menu.
1. Choose Start > Programs > BusinessObjects XI 3.1 >
BusinessObjects Data Services > Data Services Documentation.
Note:
Only a subset of the documentation is available from the Start menu. The documentation set for this release is available in LINK_DIR\Doc\Books\en.
2. Click the appropriate shortcut for the document that you want to view.
Accessing documentation on UNIX
After you install Data Services, you can access the online documentation by going to the directory where the printable PDF files were installed.
1. Go to LINK_DIR/doc/book/en/.
2. Using Adobe Reader, open the PDF file of the document that you want
to view.
Accessing documentation from the Web
You can access the complete documentation set for Data Services from the Business Objects Customer Support site.
1. Go to http://help.sap.com.
2. Click Business Objects at the top of the page.
You can view the PDFs online or save them to your computer.

Business Objects information resources

A global network of Business Objects technology experts provides customer support, education, and consulting to ensure maximum business intelligence benefit to your business.
Useful addresses at a glance:
Address
    Content

Customer Support, Consulting, and Education services
http://service.sap.com/
    Information about Customer Support programs, as well as links to technical articles, downloads, and online forums. Consulting services can provide you with information about how Business Objects can help maximize your business intelligence investment. Education services can provide information about training options and modules. From traditional classroom learning to targeted e-learning seminars, Business Objects can offer a training package to suit your learning needs and preferred learning style.

Data Services Community
https://www.sdn.sap.com/irj/sdn/businessobjects-ds
    Get online and timely information about Data Services, including tips and tricks, additional downloads, samples, and much more. All content is to and from the community, so feel free to join in and contact us if you have a submission.

Forums on SCN (SAP Community Network)
https://www.sdn.sap.com/irj/sdn/businessobjects-forums
    Search the Business Objects forums on the SAP Community Network to learn from other Data Services users and start posting questions or share your knowledge with the community.

Blueprints
http://www.sdn.sap.com/irj/boc/blueprints
    Blueprints for you to download and modify to fit your needs. Each blueprint contains the necessary Data Services project, jobs, data flows, file formats, sample data, template tables, and custom functions to run the data flows in your environment with only a few modifications.

Product documentation
http://help.sap.com/
    Business Objects product documentation.

Documentation mailbox
documentation@businessobjects.com
    Send us feedback or questions about your Business Objects documentation. Do you have a suggestion on how we can improve our documentation? Is there something that you particularly like or have found useful? Let us know, and we will do our best to ensure that your suggestion is considered for the next release of our documentation.
    Note: If your issue concerns a Business Objects product and not the documentation, please contact our Customer Support experts.

Supported platforms documentation
https://service.sap.com/bosap-support
    Get information about supported platforms for Data Services. In the left panel of the window, navigate to Documentation > Supported Platforms > BusinessObjects XI 3.1. Click the BusinessObjects Data Services link in the main window.

Chapter 2
Data Services Migration Considerations


This chapter describes behavior changes associated with the Data Integrator product since version 6.5 and in Data Services since 12.0.0 including Data Quality functionality. Each behavior change is listed under the version number in which the behavior originated.
For information about how to migrate your Data Quality Projects into Data Services, see Data Quality to Data Services Migration on page 61.
For the latest Data Services technical documentation, consult the Data Services Technical Manuals included with your product.
This Migration Considerations document contains the following sections:
Behavior changes in version 12.1.1 on page 18
Behavior changes in version 12.1.0 on page 20
Behavior changes in version 12.0.0 on page 26
Behavior changes in version 11.7.3 on page 34
Behavior changes in version 11.7.2 on page 36
Behavior changes in version 11.7.0 on page 39
Behavior changes in version 11.6.0 on page 45
Behavior changes in version 11.5.1 on page 46
Behavior changes in version 11.5.0.0 on page 47
Behavior changes in version 11.0.2.5 on page 48
Behavior changes in version 11.0.2 on page 48
Behavior changes in version 11.0.1.1 on page 49
Behavior changes in version 11.0.1 on page 49
Behavior changes in version 11.0.0 on page 50
Behavior changes in version 6.5.1 on page 53
Behavior changes in version 6.5.0.1 on page 54
Behavior changes in version 6.5.0.0 on page 55
To read or download Data Integrator and Data Services documentation for previous releases (including Release Summaries and Release Notes), see the SAP Business Objects Help Portal at http://help.sap.com/.
Behavior changes in version 12.1.1
The following sections describe changes in the behavior of Data Services
12.1.1 from previous releases of Data Services and Data Integrator. In most
cases, the new version avoids changes that would cause existing applications
to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.
If you are migrating from Data Quality to Data Services, see Data Quality to
Data Services Migration on page 61.
This section includes migration-specific information associated with the following features:
Blob data type enhancements on page 19
Neoview bulk loading on page 20

Blob data type enhancements

Data Services 12.1.1 provides the following enhancements for binary large object (blob) data types:
You can now define blob data type columns in a fixed-width file format, and you can read from and load to blob columns in fixed-width files.
The dqmigration utility now migrates Data Quality binary data types in
fixed-width flat files to Data Services blob (instead of varchar) data types in fixed-width file formats. You no longer need to change the data type from varchar to blob after migration.
In a fixed-width file, the blob data is always inline with the rest of the data in the file. The term "inline" means the data itself appears at the location where a specific column is expected.
The 12.1.0 release of Data Services introduced support for blob data types in a delimited file. In a delimited file, the blob data always references an external file at the location where the column is expected. Data Services automatically generates the file name.
The following table summarizes the capabilities that each release provides for blob data types:

File type                  Inline (12.1.0)   Inline (12.1.1)   <<Filename>> (12.1.0)   <<Filename>> (12.1.1)
blob in delimited file     No                No                Yes                     Yes
blob in fixed-width file   No                Yes               No                      No
These capabilities help customers migrate their existing Data Quality projects that handle binary data in flat files to Data Services fixed-width file formats. The Data Services blob data type now supports blob data types from Data Quality XI R2 and legacy Firstlogic products.
Related Topics
Reference Guide: Data Types, blobs

Neoview bulk loading

If you plan to bulk load data to a Neoview database, we recommend that you set Timeout to 1000 in your Neoview target table.
If you create a new repository in version 12.1.1, you do not need to set
Timeout because its default value is 1000.
If you use a 12.1.0 repository when you install version 12.1.1, the default
value for Timeout is 60. Therefore, increase Timeout to 1000 for new data flows that bulk load into a Neoview database.
Related Topics
Reference Guide: Data Services Objects, HP Neoview target table options
Behavior changes in version 12.1.0
The following sections describe changes in the behavior of Data Services
12.1.0 from previous releases of Data Services and Data Integrator. In most
cases, the new version avoids changes that would cause existing applications to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.
If you are migrating from Data Quality to Data Services, see the Data Quality to Data Services Migration Guide.
This section includes migration-specific information associated with the following features:
Cleansing package changes on page 21
DTD-to-XSD conversion on page 21
Minimum requirements for international addressing directories on page 22
Try/catch exception groups on page 22
Upgrading from version 12.0.0 to version 12.1.0 on page 25

Cleansing package changes

Global Parsing Options have been renamed cleansing packages.
You can no longer use the Global Parsing Options installer to install cleansing packages directly into the repository. You must now use a combination of the cleansing package installer and the Repository Manager instead.
If you have made any changes to your existing cleansing package dictionaries, you must do the following:
1. Export the changes using Export Dictionary Changes in the Dictionary
menu of the Data Services Designer.
2. Install the latest cleansing package.
3. Use the Repository Manager to load the cleansing package into the data
cleanse repository.
4. Import the changes into the new cleansing package using Bulk Load in
the Dictionary menu in the Designer.
Related Topics
Designer Guide: Data Quality, To export dictionary changes
Installation Guide for Windows: To create or upgrade repositories
Designer Guide: Data Quality, To import dictionary changes

DTD-to-XSD conversion

Data Services no longer supports publishing a DTD-based real-time job as a Web service if the job uses a DTD to define the input and output messages.
If you migrate from Data Services 12.0.0 to version 12.1.0, you do not need to do anything unless you change the DTD. If you change the DTD, reimport it to the repository and publish the Web service as in the following procedure.
If you migrate from Data Integrator 11.7 or earlier versions to Data Services
12.1.0 and publish a DTD-based real-time job as a Web service, you must
reimport the Web service adapter function because the Web address changed for the Management Console in version 12.0.0. Therefore, you must do the following after you upgrade your repository to version 12.1.0:
1. Use any DTD-to-XSD conversion tool to convert the DTD to XSD.
2. Use the Designer to import the XSD to the Data Services repository.
3. Open the original data flow that is using the DTD and replace it with XSD.
4. Publish the real-time job as a Web service.
5. Reimport the service as a function in the Web Service datastore.
Related Topics
Data Services web address on page 27

Minimum requirements for international addressing directories

Due to additional country support and modified database structures (for performance tuning), the minimum disk space requirement for the international addressing directories (All World) has increased as follows:
For the Global Address Cleanse transform (ga_country.dir,
ga_loc12_gen.dir, ga_loc12_gen_nogit.dir, ga_loc34_gen.dir, ga_region_gen.dir), the minimum requirement has increased from 647 MB to 2.71 GB.
If you purchase all countries, the disk space requirement has increased from 6.1 GB to 9.34 GB.

Try/catch exception groups

This version of Data Services provides better defined exception groups of errors, new exception groups, and an enhanced catch editor that allows you to select multiple exception groups in one catch to consolidate actions.
After you upgrade your repository to version 12.1, your try/catch blocks created in prior versions contain the 12.1 exception group names and numbers. Be aware of the following situations and additional actions that you might need to take after you upgrade to version 12.1:
The repository upgrade maps Parser errors (1) and Resolver errors (2) to Pre-execution errors (1000), and maps Email errors (16) to System Resource errors (1008). You need to re-evaluate all the actions that are already defined in all the catch blocks and modify them as appropriate,
based on the new catch exception group definitions. See the tables below for the mapping of exception groups from version 12.0 to version 12.1 and for the definitions of new exception groups.
All recoverable jobs in a pre-12.1 system lose their recoverable state
when you upgrade. After you upgrade to version 12.1, you need to run the job from the beginning.
If you upgrade a central repository, only the latest version of a work flow,
data flow audit script, and user function contain the 12.1 exception group names. Older versions of these objects contain the pre-12.1 exception group names.
In version 12.1, if you have a sequence of catch blocks in a workflow and
one catch block catches an exception, the subsequent catch blocks will not be executed. For example, if your work flow has the following sequence and Catch1 catches an exception, then Catch2 and CatchAll will not execute. In prior versions, both Catch1 and CatchAll will execute.
Try > DataFlow1 > Catch1 > Catch2 > CatchAll
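This new rule matches exception-handler semantics in most programming languages, where only the first matching handler runs. The following Python sketch mirrors the 12.1 behavior for illustration only; Data Services catch blocks are Designer objects, not code, and all names here are hypothetical:

class DatabaseAccessError(Exception):
    pass

def run_dataflow1():
    # Stand-in for DataFlow1; simulate a database failure.
    raise DatabaseAccessError("simulated database failure")

try:
    run_dataflow1()
except DatabaseAccessError:
    # Catch1: the first matching handler runs, and handling stops here.
    print("Catch1 handled the exception")
except Exception:
    # CatchAll: skipped under 12.1 semantics once Catch1 has matched.
    # Before 12.1, Data Services would have executed CatchAll as well.
    print("CatchAll")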
Note:
If you import pre-12.1 ATL files, any catch objects will not contain the new exception group names and numbers. Only a repository upgrade converts the pre-12.1 exception groups to the 12.1 exception group names and numbers.
The following table shows how the exception groups in version 12.0 map to the exception groups in version 12.1:

Catch All Exceptions -> All errors
    12.0: All exceptions.
    12.1: All errors.

Parser Errors (1) -> Pre-execution errors (1000)
    12.0: Errors encountered while parsing the language.
    12.1: Parser errors are not caught because parsing occurs prior to execution.

Resolver Errors (2) -> Pre-execution errors (1000)
    12.0: Errors encountered while validating the semantics of Data Services objects, which have recommended resolutions.
    12.1: Resolver errors are not caught because resolution occurs prior to execution.

Execution Errors (5) -> Execution errors (1001)
    12.0: Internal errors that occur during the execution of a data movement specification.
    12.1: Errors from the Data Services Job Server or transforms.

Database Access Errors (7) -> Database Access Errors (1002)
    12.0: Generic database access errors.
    12.1: Errors from the database server while reading data, writing data, or bulk loading to tables.

File Access Errors (8) -> Flat file processing errors (1004) and File Access Errors (1005)
    12.0: Errors accessing files through file formats.
    12.1: Errors processing flat files (1004); errors accessing local and FTP files (1005).

Repository Access Errors (10) -> Repository access errors (1006)
    12.0 and 12.1: Errors accessing the Data Services repository.

Connection and bulk loader errors (12) -> Database Connection errors (1003)
    12.0: Errors connecting to database servers and bulk loading to tables on them.
    12.1: Errors connecting to database servers.

Predefined Transforms Errors (13), ABAP Generation Errors (14), and R/3 Execution Errors (15) -> R/3 system errors (1007)
    12.0: Predefined transform errors; ABAP generation errors; R/3 execution errors.
    12.1: Errors while generating ABAP programs, during ABAP-generated user transforms, or while accessing the R/3 system using its API.

Email Errors (16) and System Exception Errors (17) -> System Resource errors (1008)
    12.0: Email errors; system exception errors.
    12.1: Errors while accessing or using operating system resources, or while sending emails.

Engine Abort Errors (20) -> Execution errors (1001)
    12.0: Engine abort errors.
    12.1: Errors from the Data Services Job Server or transforms.

The following table shows the new exception groups in version 12.1:

SAP BW execution errors (1009)
    Errors from the SAP BW system.
XML processing errors (1010)
    Errors processing XML files and messages.
COBOL copybook errors (1011)
    Errors processing COBOL copybook files.
Excel book errors (1012)
    Errors processing Excel books.
Data Quality transform errors (1013)
    Errors processing Data Quality transforms.
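If you audit upgraded jobs with a script, the mapping can be encoded as a simple lookup table. The following Python snippet merely restates the mapping table above for convenience; it is not a Data Services API:

# 12.0 exception group number -> 12.1 exception group number(s),
# restating the mapping table above (File Access Errors split into two groups).
EXCEPTION_GROUP_UPGRADE_MAP = {
    1: (1000,),       # Parser Errors -> Pre-execution errors
    2: (1000,),       # Resolver Errors -> Pre-execution errors
    5: (1001,),       # Execution Errors -> Execution errors
    7: (1002,),       # Database Access Errors -> Database Access Errors
    8: (1004, 1005),  # File Access Errors -> Flat file processing / File Access
    10: (1006,),      # Repository Access Errors -> Repository access errors
    12: (1003,),      # Connection and bulk loader errors -> Database Connection
    13: (1007,),      # Predefined Transforms Errors -> R/3 system errors
    14: (1007,),      # ABAP Generation Errors -> R/3 system errors
    15: (1007,),      # R/3 Execution Errors -> R/3 system errors
    16: (1008,),      # Email Errors -> System Resource errors
    17: (1008,),      # System Exception Errors -> System Resource errors
    20: (1001,),      # Engine Abort Errors -> Execution errors
}

print(EXCEPTION_GROUP_UPGRADE_MAP[8])  # (1004, 1005)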

Upgrading from version 12.0.0 to version 12.1.0

If you are installing version 12.1.0 and the installer detects a previous installation of version 12.0, you will be prompted to first uninstall version
12.0. The installer will maintain your configuration settings if you install in
the same directory.
If you are installing version 12.1.0 on top of version 11.x, you do not need to uninstall the previous version.
Behavior changes in version 12.0.0
The following sections describe changes in the behavior of Data Services
12.0.0 from previous releases of Data Integrator. In most cases, the new
version avoids changes that would cause existing applications to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.
If you are migrating from Data Quality to Data Services, see the Data Quality to Data Services Migration Guide.
This section includes migration-specific information associated with the following features:
Case transform enhancement on page 26
Data Quality projects in Data Integrator jobs on page 27
Data Services web address on page 27
Large object data type enhancements on page 28
License keycodes on page 31
Locale selection on page 31
ODBC bigint data type on page 33
Persistent and pageable cache enhancements on page 33
Row delimiter for flat files on page 33

Case transform enhancement

In this version, you can choose the order of Case expression processing to improve performance by processing the less CPU-intensive expressions first. When the Preserve case expression order option is not selected in the Case transform, Data Services determines the order to process the case expressions. The Preserve case expression order option is available only when the Row can be TRUE for one case only option is selected.
By default, the Row can be TRUE for one case only option is selected and the Preserve case expression order option is not selected. Therefore, when you migrate to this version, Data Services will choose the order to process your Case expressions by default.
However, the reordering of expressions can change your results because there is no way to guarantee which expression will evaluate to TRUE first. If your results changed in this version and you want to obtain the same results as prior versions, select the Preserve case expression order option.
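The performance reasoning is ordinary short-circuit evaluation: when at most one case can be true, testing cheap expressions first means most rows never pay for the expensive ones. The sketch below is illustrative Python, not Data Services internals, and the pattern and field names are made up; it also shows why reordering can change results when a row happens to satisfy more than one expression:

import re

EXPENSIVE = re.compile(r"(ACME|A\.C\.M\.E\.|AcMe)")   # hypothetical costly pattern

# Each case expression is a (label, predicate) pair; predicates run lazily,
# so putting the cheap comparison first spares most rows the regex cost.
CASES_AS_DESIGNED = [
    ("us_branch",   lambda row: row["region"] == "US"),               # cheap
    ("acme_branch", lambda row: bool(EXPENSIVE.search(row["name"]))), # costly
]
CASES_REORDERED = list(reversed(CASES_AS_DESIGNED))   # what the engine might pick

def route(row, cases):
    for label, predicate in cases:
        if predicate(row):
            return label   # "Row can be TRUE for one case only": first match wins
    return "default"

row = {"region": "US", "name": "ACME Corp"}   # satisfies both expressions
print(route(row, CASES_AS_DESIGNED))          # us_branch
print(route(row, CASES_REORDERED))            # acme_branch -- order changed the result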

Data Quality projects in Data Integrator jobs

To do data cleansing in Data Integrator version 11.7, you created a Data Quality datastore and imported integrated batch projects as Data Quality transforms. When these imported Data Quality transforms were used in an
11.7 job, the data was passed to Data Quality for cleansing, and then passed
back to the Data Integrator job.
In Data Services 12, the Data Quality transforms are built in. Therefore, if you used imported Data Quality transforms in Data Integrator 11.7, you must replace them in Data Services with the new built-in Data Quality transforms.
Related Topics
Modifying Data Integrator 11.7 Data Quality projects on page 161
Migrating Data Quality integrated batch projects on page 159
How integrated batch projects migrate on page 89

Data Services web address

In this release, Data Integrator has become part of Data Services. Therefore, the Web address has changed for the Management Console. In previous releases, the Web address used "diAdmin" as the following format shows:
http://computername:port/diAdmin
In Data Services, the Web address uses DataServices:
http://computername:port/DataServices
Therefore, when you migrate to Data Services you must make changes in the following situations:
If you created a bookmark that points to the Management Console in a
previous release, you must update the bookmark to the changed Web address.
If you generated a Web Service Definition Language (WSDL) file in a
previous version of Data Integrator, you must regenerate it to use the changed Web address of the Administrator.

Large object data type enhancements

Data Services 12.0 extends the support of large objects as follows:
Adds support for binary large object (blob) data types from the currently
supported database systems (Oracle, DB2, Microsoft SQL Server, and so on).
Extends support for character large object (clob) and national character
object (nclob) data types to other databases.
Prior versions treat the clob and nclob data types as long data types, and this version continues to treat them as long data types.
The following table shows the large data types that version 11.7 supports as long data types and the additional large data types that version 12 now supports. If your pre-version 12 jobs have sources that contain these previously unsupported large data types and you now want to use them in version 12, you must re-import the source tables and modify your existing jobs to select these newly supported data types.
Table 2-4: Database large object data types supported

Database                  Database data type    Category   11.7 supports   12.0 supports   12.0 data type
DB2                       LONG VARCHAR          LONG       Yes             Yes             CLOB
DB2                       CLOB                  LONG       Yes             Yes             CLOB
DB2                       LONG VARGRAPHIC       LONG       No              Yes             NCLOB
DB2                       DBCLOB                LONG       No              Yes             NCLOB
DB2                       BLOB                  BLOB       No              Yes             BLOB
Informix                  LVARCHAR              VARCHAR    Yes             Yes             VARCHAR
Informix                  TEXT                  LONG       Yes             Yes             CLOB
Informix                  BYTE                  BLOB       No              Yes             BLOB
Informix                  CLOB                  LONG       Yes             Yes             CLOB
Informix                  BLOB                  BLOB       No              Yes             BLOB
Microsoft SQL Server      TEXT                  LONG       Yes             Yes             CLOB
Microsoft SQL Server      NTEXT                 LONG       No              Yes             NCLOB
Microsoft SQL Server      VARCHAR(max)          LONG       No              Yes             CLOB
Microsoft SQL Server      NVARCHAR(max)         LONG       No              Yes             NCLOB
Microsoft SQL Server      IMAGE                 BLOB       No              Yes             BLOB
Microsoft SQL Server      VARBINARY(max)        BLOB       No              Yes             BLOB
MySQL                     TEXT                  LONG       Yes             Yes             CLOB
MySQL                     BLOB                  BLOB       No              Yes             BLOB
ODBC                      SQL_LONGVARCHAR       LONG       Yes             Yes             CLOB
ODBC                      SQL_WLONGVARCHAR      LONG       No              Yes             NCLOB
ODBC                      SQL_LONGVARBINARY     BLOB       No              Yes             BLOB
Oracle                    LONG                  LONG       Yes             Yes             CLOB
Oracle                    LONG RAW              BLOB       No              Yes             BLOB
Oracle                    CLOB                  LONG       Yes             Yes             CLOB
Oracle                    NCLOB                 LONG       Yes             Yes             NCLOB
Oracle                    BLOB                  BLOB       No              Yes             BLOB
Sybase ASE                TEXT                  LONG       No              Yes             CLOB
Sybase ASE                IMAGE                 BLOB       No              Yes             BLOB
Sybase IQ 12.6 or later   LONG VARCHAR          LONG       Yes             Yes             CLOB
Sybase IQ 12.6 or later   LONG BINARY           BLOB       No              Yes             BLOB
Teradata                  LONG VARCHAR          LONG       Yes             Yes             CLOB
Teradata                  CLOB                  LONG       Yes             Yes             CLOB
Teradata                  BLOB                  BLOB       No              Yes             BLOB

License keycodes

In this version, Data Services incorporates the BusinessObjects Enterprise installation technology and uses keycodes to manage the licenses for the different features. Therefore, Data Services does not use .lic license files anymore but manages keycodes in the License Manager.

Locale selection

In this version, you no longer set the locale of the Job Server when you install Data Services. After installation, the locale of the Job Server is set to <default> which enables Data Services to automatically set the locale for the repository connection (for the Designer) and to process job data (for the Job Server) according to the locale of the datastore or operating system. This capability enables Data Services to automatically change the locale for better performance (for example, set the locale to non-UTF-8 if the datastore is non-Unicode data).
The following table shows different datastore and Job Server locale settings, the resulting locale that prior versions set, and the new locale that version 12.0 sets for the data flow. In this table, the Job Server locale is set to <default> and derives its value from the operating system.

Datastore 1 locale        Datastore 2 locale        Job Server locale           Data flow locale (prior)    Data flow locale (12.0)
Single-byte code page     Multi-byte code page      Single- or multi-byte       Same locale as Job Server   Unicode
Multi-byte code page      Multi-byte code page      Single-byte code page       Single-byte code page       Unicode
Multi-byte code page      Multi-byte code page      Multi-byte code page        Unicode                     Unicode
Single-byte code page 1   Single-byte code page 2   Single-byte code page 3     Single-byte code page 3     Unicode
Single-byte code page 1   Single-byte code page 2   Multi-byte code page        Unicode                     Unicode
Single-byte code page 3   Single-byte code page 3   Single-byte code page 1     Single-byte code page 1     Single-byte code page 3
Single-byte code page 3   Single-byte code page 3   Multi-byte code page        Unicode                     Unicode
The following table summarizes the locale that Data Services now sets for each data flow when the locale of the Job Server is set to <default>. Different data flows in the same job can run in either single-byte or Unicode.
Locale of datastores in data flow       Job Server locale           Locale that Data Services sets
One datastore has a multi-byte locale   Single-byte or multi-byte   Unicode
Different single-byte locales           Single-byte or multi-byte   Unicode
Same single-byte locale                 Single-byte                 Same single-byte locale
Same single-byte locale                 Multi-byte                  Unicode

You can override the default locale for the Job Server by using the Data Services Locale Selector utility. From the Windows Start menu, select Programs > BusinessObjects XI 3.1 > BusinessObjects Data Services > Data Services Locale Selector.
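Both tables reduce to one decision rule: a data flow stays in a single-byte code page only when every datastore shares that same single-byte code page and the Job Server locale is also single-byte; every other combination runs in Unicode. A sketch of that rule in illustrative Python, not Data Services internals (the code page names are examples):

from collections import namedtuple

# code_page is a name such as "cp1252"; multibyte marks locales such as UTF-8.
Locale = namedtuple("Locale", ["code_page", "multibyte"])

def data_flow_locale(datastores, job_server):
    code_pages = {ds.code_page for ds in datastores}
    if (not any(ds.multibyte for ds in datastores)
            and len(code_pages) == 1
            and not job_server.multibyte):
        return code_pages.pop()    # same single-byte locale everywhere
    return "Unicode"               # any multi-byte or mixed combination

# Two datastores on the same single-byte code page, single-byte Job Server:
print(data_flow_locale([Locale("cp1252", False), Locale("cp1252", False)],
                       Locale("cp850", False)))    # cp1252
# A multi-byte Job Server forces Unicode even for matching datastores:
print(data_flow_locale([Locale("cp1252", False), Locale("cp1252", False)],
                       Locale("utf8", True)))      # Unicode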

ODBC bigint data type

For an ODBC datastore, Data Services now imports a bigint data type as decimal. In prior releases of Data Integrator, the bigint data type was imported as a double data type. If your pre-version 12 jobs have sources that contain bigint data types, you must re-import the source tables and modify your existing jobs to handle them as decimal data types.
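The change matters because a double has only 53 bits of precision, so bigint values beyond 2^53 were silently rounded in prior releases, while a decimal keeps every digit. A quick Python illustration of the difference:

from decimal import Decimal

big = 9007199254740993                # 2**53 + 1, a legal bigint value

as_double = float(big)                # how pre-12 releases imported bigint
as_decimal = Decimal(big)             # how Data Services 12 imports bigint

print(int(as_double))                 # 9007199254740992 -- last digit rounded away
print(as_decimal)                     # 9007199254740993 -- exact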

Persistent and pageable cache enhancements

This release of Data Services provides performance enhancements for the persistent and pageable caches. Decimal data types now use only half the memory used in prior versions.
However, persistent cache tables created in prior versions are not compatible with Data Services. You must recreate them by rerunning the jobs that originally created and loaded the target persistent cache tables.

Row delimiter for flat files

In Data Services 12, you can now specify the following values as row delimiters for flat files:
{new line}
If you specify this value for the row delimiter, Data Services writes the appropriate characters for the operating system on which the Job Server is running:
CRLF (\r\n) in Windows
LF (\n) in UNIX
any character sequence
In this case, Data Services writes the characters you entered.
{UNIX new line}
In this case, Data Services writes the characters LF (\n) regardless of the operating system.

{Windows new line}
In this case, Data Services writes the characters CRLF (\r\n) regardless of the operating system.
In previous releases, you could only specify the following values as row delimiters for flat files, and the behavior is the same as in the new release:
{new line}
any character sequence
If your target appends to an existing file that was generated in a prior release, Data Services is not backward compatible for the following situations:
Your Job Server runs on a Windows platform and you choose {UNIX new
line} for the row delimiter.
Your Job Server runs on a UNIX system and you choose {Windows new
line} for the row delimiter.
In these situations, you must define a new file format, load data from the existing file into the new file specifying the new row delimiter, and then append new data to the new file with the new row delimiter.
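The incompatibility is visible at the byte level: rows written with one delimiter cannot be cleanly split by a reader expecting the other. A small Python illustration (not Data Services code):

existing = b"a,1\r\nb,2\r\n"    # file written earlier with {Windows new line}
appended = b"c,3\nd,4\n"        # new rows appended with {UNIX new line}

mixed = existing + appended
print(mixed.split(b"\n"))
# [b'a,1\r', b'b,2\r', b'c,3', b'd,4', b''] -- the older rows keep a stray \r,
# so the last field of each old row is corrupted for a reader that splits on \n.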
Behavior changes in version 11.7.3
The following sections describe changes in the behavior of Data Integrator 11.7.3 from previous releases. In most cases, the new version avoids changes that would cause existing applications to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.
This section includes migration-specific information associated with the following features:
Data flow cache type on page 35
Job Server enhancement on page 35
Logs in the Designer on page 35
Pageable cache for memory-intensive data flows on page 35
Adapter SDK on page 36
PeopleSoft 8 on page 36

Data flow cache type

When upgrading your repository from versions earlier than 11.7 to an 11.7 repository using version 11.7.3.0, all of the data flows will have a default Cache type value of pageable. This is different from the behavior in 11.7.2.0, where the upgraded data flows have a default Cache type value of in-memory.

Job Server enhancement

Using multithreaded processing for incoming requests, each Data Integrator Job Server can now accommodate up to 50 Designer clients simultaneously with no compromise in response time. (To accommodate more than 50 Designers at a time, create more Job Servers.)
In addition, the Job Server now generates a Job Server log file for each day. You can retain the Job Server logs for a fixed number of days using a new setting on the Administrator Log retention period page.

Logs in the Designer

In Data Integrator 11.7.3, you will only see the logs (trace, error, monitor) for jobs that started from the Designer, not for jobs started via other methods (command line, real-time, scheduled jobs, or Web services). To access these other log files, use the Administrator in the Data Integrator Management Console.

Pageable cache for memory-intensive data flows

As a result of multibyte metadata support, Data Integrator might consume more memory when processing and running jobs. If the memory consumption of some of your jobs were running near the 2-gigabyte virtual memory limit in a prior version, there is a chance that the same jobs could run out of virtual memory. If your jobs run out of memory, take the following actions:
Set the data flow Cache type value to pageable.
Specify a pageable cache directory that:
Contains enough disk space for your data. To estimate the amount of space required for pageable cache, consider factors such as the number of concurrently running jobs or data flows and the amount of pageable cache required for each concurrent data flow (see the sizing sketch after this list).
Exists on a separate disk or file system from the Data Integrator system
and operating system (such as the C: drive on Windows or the root file system on UNIX).
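A back-of-the-envelope sizing of the pageable cache directory based on those factors; all figures below are assumptions for illustration, not product guidance:

concurrent_data_flows = 4      # jobs/data flows expected to run at the same time
cache_per_flow_gb = 10         # estimated pageable cache each data flow needs
headroom = 1.5                 # safety factor for peaks and temporary files

required_gb = concurrent_data_flows * cache_per_flow_gb * headroom
print(f"Provision at least {required_gb:.0f} GB for the pageable cache directory")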

Adapter SDK

The Adapter SDK no longer supports native SQL or partial SQL.

PeopleSoft 8

PeopleSoft 8 support is implemented for Oracle only.
Data Integrator jobs that ran against previous versions of PeopleSoft are not guaranteed to work with PeopleSoft 8. You must update the jobs to reflect metadata or schema differences between PeopleSoft 8 and previous versions.
Behavior changes in version 11.7.2
The following sections describe changes in the behavior of Data Integrator
11.7.2 from previous releases. In most cases, the new version avoids changes
that would cause existing applications to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.
This section includes migration-specific information associated with the following features:
Embedded data flows on page 37
Oracle Repository upgrade on page 37
Solaris and AIX platforms on page 39

Embedded data flows

In this version of Data Integrator, you cannot create embedded data flows which have both an input port and an output port. You can create a new embedded data flow only at the beginning or at the end of a data flow with at most one port, which can be either an input or an output port.
However, after upgrading to Data Integrator version 11.7.2, embedded data flows created in previous versions will continue to run.

Oracle Repository upgrade

If you previously upgraded your repository to Data Integrator 11.7.0 and open the "Object State Report" on the central repository from the Web Administrator, you may see the error "ORA-04063: view ALVW_OBJ_CINOUT has errors". This occurs if you had a pre-11.7.0 Oracle central repository and upgraded the central repository to 11.7.0.
Note:
If you upgraded from a pre-11.7.0.0 version of Data Integrator to version
11.7.0.0 and you are now upgrading to version 11.7.2.0, this issue may occur,
and you must follow the instructions below. Alternatively, if you upgraded from a pre-11.7.0.0 version of Data Integrator to 11.7.2.0 without upgrading to version 11.7.0.0, this issue will not occur and has been fixed in 11.7.2.0.
To fix this error, manually drop and recreate the view ALVW_OBJ_CINOUT using an Oracle SQL editor, such as SQL*Plus.
Use the following SQL statements to perform the upgrade:
DROP VIEW ALVW_OBJ_CINOUT;

CREATE VIEW ALVW_OBJ_CINOUT
  (OBJECT_TYPE, NAME, TYPE, NORMNAME, VERSION, DATASTORE, OWNER, STATE,
   CHECKOUT_DT, CHECKOUT_REPO, CHECKIN_DT, CHECKIN_REPO, LABEL, LABEL_DT,
   COMMENTS, SEC_USER, SEC_USER_COUT)
AS
(
-- Objects from AL_LANG, excluding internal objects
select OBJECT_TYPE*1000+TYPE, NAME, N'AL_LANG', NORMNAME, VERSION, DATASTORE,
       OWNER, STATE, CHECKOUT_DT, CHECKOUT_REPO, CHECKIN_DT, CHECKIN_REPO,
       LABEL, LABEL_DT, COMMENTS, SEC_USER, SEC_USER_COUT
from AL_LANG L1
where NORMNAME NOT IN (N'CD_DS_D0CAFAE2', N'XML_TEMPLATE_FORMAT',
      N'CD_JOB_D0CAFAE2', N'CD_DF_D0CAFAE2', N'DI_JOB_AL_MACH_INFO',
      N'DI_DF_AL_MACH_INFO', N'DI_FF_AL_MACH_INFO')
union
-- User functions from AL_FUNCINFO
select 20001, NAME, FUNC_TYPE, NORMNAME, VERSION, DATASTORE, OWNER, STATE,
       CHECKOUT_DT, CHECKOUT_REPO, CHECKIN_DT, CHECKIN_REPO, LABEL, LABEL_DT,
       COMMENTS, SEC_USER, SEC_USER_COUT
from AL_FUNCINFO F1
where FUNC_TYPE = N'User_Script_Function' OR OWNER <> N'acta_owner'
union
-- Projects from AL_PROJECTS
select 30001, NAME, N'PROJECT', NORMNAME, VERSION, N'', N'', STATE,
       CHECKOUT_DT, CHECKOUT_REPO, CHECKIN_DT, CHECKIN_REPO, LABEL, LABEL_DT,
       COMMENTS, SEC_USER, SEC_USER_COUT
from AL_PROJECTS P1
union
-- Imported tables and schemas from AL_SCHEMA, excluding the internal datastore
select 40001, NAME, TABLE_TYPE, NORMNAME, VERSION, DATASTORE, OWNER, STATE,
       CHECKOUT_DT, CHECKOUT_REPO, CHECKIN_DT, CHECKIN_REPO, LABEL, LABEL_DT,
       COMMENTS, SEC_USER, SEC_USER_COUT
from AL_SCHEMA DS1
where DATASTORE <> N'CD_DS_d0cafae2'
union
-- Domains from AL_DOMAIN_INFO
select 50001, NAME, N'DOMAIN', NORMNAME, VERSION, DATASTORE, N'', STATE,
       CHECKOUT_DT, CHECKOUT_REPO, CHECKIN_DT, CHECKIN_REPO, N'',
       to_date(N'01/01/1970', N'MM/DD/YYYY'), N'', SEC_USER, SEC_USER_COUT
from AL_DOMAIN_INFO D1
);

Solaris and AIX platforms

Data Integrator 11.7.2 on Solaris and AIX platforms is a 64-bit application and requires 64-bit versions of the middleware client software (such as Oracle and SAP) for effective connectivity. If you are upgrading to Data Integrator
11.7.2 from a previous version, you must also upgrade all associated
middleware client software to the 64-bit version of that client. You must also update all library paths to ensure that Data Integrator uses the correct 64-bit library paths.
Behavior changes in version 11.7.0
The following sections describe changes in the behavior of Data Integrator 11.7.0 from previous releases. In most cases, the new version avoids changes that would cause existing applications to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.
This section includes migration-specific information associated with the following features:
Data Quality on page 40
Distributed data flows on page 42
JMS Adapter interface on page 43
XML Schema enhancement on page 43
Password management on page 43
Repository size on page 44
Web applications on page 44
Web services on page 44

Data Quality

Data Integrator 11.7.0 integrates the BusinessObjects Data Quality XI application for your data quality (formerly known as Data Cleansing) needs, which replaces Firstlogic's RAPID technology.
Note:
The following changes are obsolete with Data Services version 12.0 because the Data Quality transforms are built into Data Services, and you can use them just like the regular Data Integrator transforms in a data flow.
The following changes to data cleansing occurred in Data Integrator 11.7.0:
Depending on the Firstlogic products you owned, you previously had up
to three separate transforms that represented data quality functionality: Address_Enhancement, Match_Merge, and Name_Parsing.
Now, the data quality process takes place through a Data Quality Project. To upgrade existing data cleansing data flows in Data Integrator, replace each of the cleansing transforms with an imported Data Quality Project using the Designer.
You must identify all of the data flows that contain any data cleansing transforms and replace them with a new Data Quality Project that connects to a Data Quality blueprint or custom project.
Data Quality includes many example blueprints: sample projects that are ready to run or can serve as a starting point when creating your own customized projects. If the existing blueprints do not completely suit your needs, just save any blueprint as a project and edit it. You can also create a project from scratch.
You must use the Project Architect (Data Quality's graphical user interface) to edit projects or create new ones. Business Objects strongly recommends that you do not attempt to manually edit the XML of a project or blueprint.
Each imported Data Quality project in Data Integrator represents a reference to a project or blueprint on the data quality server. The Data Integrator Data Quality projects allow field mapping.
To migrate your data flow to use the new Data Quality transforms
Note:
The following procedure is now obsolete with Data Services version 12.0 because the Data Quality transforms are now built into Data Services and you can use them just like the regular Data Integrator transforms in a data flow. If you performed this procedure in Data Integrator version 11.7, you will need to migrate these data flows to Data Services. See Data Quality projects in Data Integrator jobs on page 27.
1. Install Data Quality XI, configure and start the server. For installation instructions, see your Data Quality XI documentation.
Note:
You must start the server before using Data Quality XI with Data Integrator.
2. In the Data Integrator Designer, create a new Business Objects Data Quality datastore and connect to your Data Quality server.
3. Import the Data Quality projects that represent the data quality transformations you want to use. Each project appears as a Data Quality project in your datastore. For the most common data quality transformations, you can use existing blueprints (sample projects) in the Data Quality repository.
4. Replace each occurrence of the old data cleansing transforms in your data flows with one of the imported Data Quality transforms. Reconnect the input and output schemas with the sources and targets used in the data flow.
Note:
If you open a data flow containing old data cleansing transforms (address_enhancement, name_parsing, match_merge), Data Integrator displays the old transforms (even though they no longer appear in the object library). You can even open the properties and see the details for each old transform.
If you attempt to validate a data flow that contains an old data cleansing transform, Data Integrator throws an error. For example:
[Custom Transform:Address_Enhancement] BODI-1116074: First Logic support is obsolete. Please use the new Data Quality feature.
If you attempt to execute a job that contains data flows using the old data cleansing transforms, Data Integrator throws the same type of error.
If you need help migrating your data cleansing data flows to the new Data Quality transforms, see the SAP Business Objects Help Portal at http://help.sap.com/.

Distributed data flows

After upgrading to this version of Data Integrator, existing jobs have the following default values and behaviors:
Job distribution level: Job.
All data flows within a job will be run on the same job server.
The cache type for all data flows: In-memory. This cache type uses an STL map and applies to all join caches, table comparison caches, lookup caches, and so forth.
Default for Collect statistics for optimization and Collect statistics for monitoring: deselected.
Default for Use collected statistics: selected.
Since no statistics are initially collected, Data Integrator will not initially use statistics.
Every data flow is run as a process (not as a sub data flow process).
New jobs and data flows you create using this version of Data Integrator have the following default values and behaviors:
Job distribution level: Job.
The cache type for all data flows: Pageable.
Collect statistics for optimization and Collect statistics for monitoring: deselected.
Use collected statistics: selected.
If you want Data Integrator to use statistics, you must collect statistics for optimization first.
Every data flow is run as a single process. To run a data flow as multiple sub data flow processes, you must use the Data_Transfer transform or select the Run as a separate process option in transforms or functions.
All temporary cache files are created under the LINK_DIR\Log\PCache directory. You can change this option from the Server Manager.

JMS Adapter interface

A new license key may be required to install the JMS Adapter interface. If you have a license key issued prior to Data Integrator XI R2 version 11.5.1, send a request to licensing@businessobjects.com with "Data Integrator License Keys" as the subject line.

XML Schema enhancement

Data Integrator 11.7 adds the new Include schema location option for XML target objects. This option is selected by default.
Data Integrator 11.5.2 provided the key XML_Namespace_No_SchemaLocation for section AL_Engine in the Designer option Tools > Options > Job Server > General; the default value, FALSE, indicates that the schema location is included. If you upgraded from 11.5.2 and had set XML_Namespace_No_SchemaLocation to TRUE (which indicates that the schema location is not included), you must open the XML target in all data flows and clear the Include schema location option to keep the old behavior for your XML target objects.
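For reference, keys set through Tools > Options > Job Server > General are stored as section/key/value entries in the Job Server's configuration file (DSConfig.txt). A minimal sketch of what the 11.5.2-era entry looked like, assuming the default file location (section header capitalization may vary by version):

[AL_Engine]
XML_Namespace_No_SchemaLocation=TRUE

In 11.7 and later, you control this behavior per XML target with the Include schema location option instead.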

Password management

Data Integrator now encrypts all password fields using the Twofish algorithm.
To simplify updating new passwords for the repository database, Data Integrator includes a password file feature. If you do not have a requirement to change the password to the database that hosts the repository, you may not need to use this optional feature.
However, if you must change the password (for example, security requirements stipulate that you must change your password every 90 days), then Business Objects recommends that you migrate your scheduled or external job command files to use this feature.
Migration requires that every job command file be regenerated to use the password file. After migration, when you update the repository password, you need only regenerate the password file. If you do not migrate using the password file feature, then you must regenerate every job command file every time you change the associated password.

Repository size

Due to the multi-byte metadata support, the size of the Data Integrator repository is about two times larger for all database types except Sybase.

Web applications

The Data Integrator Administrator (formerly called the Web Administrator) and Metadata Reports interfaces have been combined into the new Management Console in Data Integrator 11.7. Now, you can start any Data Integrator Web application from the Management Console launch pad (home page). If you have created a bookmark or favorite that points to the previous Administrator URL, you must update the bookmark to point to http://computername:port/diAdmin.
If in a previous version of Data Integrator you generated WSDL for Web service calls, you must regenerate the WSDL because the URL to the Administrator has been changed in Data Integrator 11.7.

Web services

Data Integrator now uses the Xerces2 library. If you are upgrading to 11.7 or later and your Web Services adapter was configured to use the xsdPath parameter in the Web Service configuration file, delete the old Web Services adapter and create a new one. It is no longer necessary to configure the xsdPath parameter.

Behavior changes in version 11.6.0
Data Integrator version 11.6.0.0 will only support automated upgrade from ActaWorks and Data Integrator versions 5.2.0 and above to version 11.5. For customers running versions prior to ActaWorks 5.2.0 the recommended migration path is to first upgrade to Data Integrator version 6.5.1.
If you are upgrading to 11.6.0.0, to take advantage of the new Netezza bulk-loading functionality, you must upgrade your repository.
If you are upgrading from Data Integrator versions 11.0.0 or older to Data Integrator 11.6.0.0, you must upgrade your repository. If upgrading from 11.5.0 to 11.6.0.0, you must upgrade your repository to take advantage of the new:
Data Quality Dashboards feature
Preserving database case feature
Salesforce.com Adapter Interface (delivered with version 11.5 SP1 Adapter Interfaces)
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.
This section includes:
Netezza bulk loading on page 45
Conversion between different data types on page 46

Netezza bulk loading

If you are upgrading to 11.6.0.0, to take advantage of the new Netezza bulk-loading functionality, you must upgrade your repository.

Conversion between different data types

For this release, there is a change in behavior for data conversion between different data types. Previously, if an error occurred during data conversion (for example, when converting a varchar string that contains non-digits to an integer), the result was random. Now, the return value will be NULL for any unsuccessful conversion.
Previously, Data Integrator returned random data for the result of a varchar-to-datetime conversion when the varchar string contains an illegal date format. Now, the return value will be NULL.
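Because failed conversions now return NULL instead of random data, you can guard or detect them explicitly. A minimal sketch in the Data Services expression language, assuming the ifthenelse, is_valid_date, and nvl built-in functions are available in your version; the column name ORDER_DATE_STR is illustrative:

ifthenelse(is_valid_date(ORDER_DATE_STR, 'YYYY.MM.DD'),
           to_date(ORDER_DATE_STR, 'YYYY.MM.DD'),
           NULL)

You can also substitute a sentinel value for failed conversions, for example nvl(to_date(ORDER_DATE_STR, 'YYYY.MM.DD'), to_date('1900.01.01', 'YYYY.MM.DD')).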
Behavior changes in version 11.5.1.5
Data Integrator version 11.5.1.5 supports only automated upgrade from ActaWorks and Data Integrator versions 5.2.0 and above to version 11.5. For customers running versions prior to ActaWorks 5.2.0 the recommended migration path is to first upgrade to Data Integrator version 6.5.1.
If you are upgrading to 11.5.1.5, to take advantage of the new Netezza bulk-loading functionality, you must upgrade your repository.
If you are upgrading from Data Integrator versions 11.0.0 or older to Data Integrator 11.5.1.5, you must upgrade your repository. If upgrading from 11.5.0 to 11.5.1.5, you must upgrade your repository to take advantage of these features:
Data Quality Dashboards
Preserving database case
Salesforce.com Adapter Interface (delivered with version 11.5 SP1 Adapter Interfaces)

Behavior changes in version 11.5.1

No new behavior changes in this release.

Behavior changes in version 11.5.0.0
Data Integrator version 11.5 will only support automated upgrade from ActaWorks and Data Integrator versions 5.2.0 and above to version 11.5. For customers running versions prior to ActaWorks 5.2.0 the recommended migration path is to first upgrade to Data Integrator version 6.5.1.
If you are upgrading from Data Integrator version 11.0.0 to Data Integrator version 11.5.0.0, you must upgrade your repository.
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to modify their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Web Services Adapter

Newer versions of Data Integrator will overwrite the XSD that defines the input and output message files. Data Integrator stores those XSDs in a Tomcat directory and during Metadata import passes to the engine a URL to the XSDs. The engine expects the XSDs to be available at run time.
To avoid this issue, prior to installing Data Integrator version 11.5, save a copy of your LINK_DIR\ext\webservice contents. After installing the new version of Data Integrator, copy the directory contents back into place. If you do not follow this procedure, the web service operations will require re-import from the web service datastore after the installation completes.

Varchar behavior

Data Integrator 11.5.0 and newer (including Data Services) conforms to ANSI SQL-92 varchar behavior. When you upgrade and save your previous configurations, the software uses the previous varchar behavior as the default. However, Business Objects recommends that you use the ANSI varchar behavior because the previous varchar behavior will not be supported in future versions.

Central Repository

Business Objects recommends the following guidelines to upgrade your central repository when installing a new version of Data Integrator.
Maintain a separate central repository for each Data Integrator version to preserve object history.
An upgrade of the central repository might override the versions of any object whose internal representation (ATL) has changed in the new release.
Behavior changes in version 11.0.2.5
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to change their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Teradata named pipe support

Beginning with Data Integrator version 11.0.2.5, you can use named pipes to bulk load data into Teradata databases. When Data Integrator tries to connect to the pipes, Teradata Warehouse Builder might not yet have created them. Data Integrator waits one second and retries the connection, for up to 30 seconds. If you need to change the 30-second wait time, in the Designer select Tools > Options > Job Server > General, enter al_engine for Section, enter NamedPipeWaitTime for Key, and enter a different value.
If you use Warehouse Builder and Teradata Tools and Utilities version 7.0 or 7.1, in the Data Integrator Designer select Tools > Options > Job Server > General, enter al_engine for Section, enter PreTWB5_Syntax for Key, and set the value to TRUE.
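As with other Job Server keys, these settings end up as section/key/value entries in the Job Server's DSConfig.txt file. A minimal sketch, assuming the default file location and an illustrative 60-second wait time:

[al_engine]
NamedPipeWaitTime=60
PreTWB5_Syntax=TRUE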

Behavior changes in version 11.0.2

There are no new behavior changes for this release.

Behavior changes in version 11.0.1.1
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to change their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Statistics repository tables

The repository tables AL_FLOW_STAT and AL_JOB_STAT are no longer populated for job, work flow, or data flow statistics. These two tables no longer exist in the newly created Data Integrator version 11.0.1 repositories. They may still exist in the repositories that are upgraded to version 11.0.1.
Beginning with Data Integrator version 11.0.1, all job, work flow, and data flow statistics are stored in the AL_HISTORY, AL_HISTORY_INFO, and AL_STATISTICS tables. In particular, the job statistics, previously stored in AL_JOB_STAT, are now stored in the AL_HISTORY and AL_HISTORY_INFO tables. The work flow and data flow statistics, previously stored in AL_FLOW_STAT, are now stored in the AL_STATISTICS table. Use the metadata reporting tool to retrieve all job, work flow, and data flow statistics.
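If you have custom reports that queried the old tables directly, point them at the new tables instead. A minimal exploratory sketch, assuming direct SQL access to the repository (column names and join keys vary by version, so inspect the schema before building reports on it):

-- Job-level statistics (formerly in AL_JOB_STAT)
SELECT * FROM AL_HISTORY;
SELECT * FROM AL_HISTORY_INFO;

-- Work flow and data flow statistics (formerly in AL_FLOW_STAT)
SELECT * FROM AL_STATISTICS;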

Behavior changes in version 11.0.1

Data Integrator version 11.0 will only support automated upgrade from ActaWorks and Data Integrator versions 5.2.0 and above to version 11.0. For customers running versions prior to ActaWorks 5.2.0 the recommended migration path is to first upgrade to Data Integrator version 6.5.1.
If you are upgrading from Data Integrator version 11.0.0 to Data Integrator version 11.0.1, you must upgrade your repository to use the following features:
Sybase IQ as a source or target
New Data Integrator functions:
replace_substr_ext
long_to_varchar
varchar_to_long
load_to_xml
extract_from_xml
match_pattern
match_regex
literal
Generate delete row types in table comparison transform
COBOL data file reader
Impact Analysis in Metadata Reports for BusinessObjects XI
Auditing (some additional functionality was introduced in version 11.0.1)
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to change their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Crystal Enterprise adapters

After installing a new version of Business Objects, you might receive a "class not found" error when trying to start the Crystal Enterprise adapter. Update the Crystal Enterprise adapter configuration by updating the BusinessObjects directory name in the classpath for both the Crystal Enterprise adapter and the DMAC command file.
Behavior changes in version 11.0.0
Data Integrator version 11.0 will only support automated upgrade from ActaWorks and Data Integrator versions 5.2.0 and above to version 11.0. For customers running versions prior to ActaWorks 5.2.0 the recommended migration path is to first upgrade to Data Integrator version 6.5.1.
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to change their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Changes to code page names

Starting with Version 11.0, Data Integrator displays some code pages with different names. For example, code page MS1252 displays as CP1252. Additionally, two code pages are no longer supported. These changes are visible only in the locale screens (in the installer, datastore, and file formats). The Data Integrator engine does not have any upgrade issues and can interpret the old code page names. The following new code page names apply to Data Integrator version 11.0 and later.
Old code page name    New name
iso885910             Not supported
MS949                 CP949
MS950                 CP950
MS1250                CP1250
MS1251                CP1251
MS1253                CP1253
MS1254                CP1254
MS1255                CP1255
MS1257                CP1257
MS874                 CP874
MS1258                CP1258
MS1361                Not supported
USASCII               US-ASCII

Data Cleansing

The new Data Integrator installation will overwrite all template job files shipped with Data Integrator with upgraded job files compatible with Firstlogic 7.50. If you manually modified the template job files with Data Integrator version 6.5 and want to preserve the changes, Business Objects recommends that you copy or rename the job files prior to installing Data Integrator version 11.0.
After installing Data Integrator version 11.0 and the Firstlogic 7.5 suite, you should do the following:
From the Data Integrator CD, copy pwdiqjob.upd from the \Utilities directory to the dtr_iq directory. Click Yes to overwrite the existing file.
For all your job templates, run Firstlogic's edjob utility to upgrade your Firstlogic job files from the previous version to 7.50. Find the edjob utility under the installation of Firstlogic. Run the utility to upgrade all jobs of the same extension by specifying *.<extension> in the utility. Extension names are "diq", "iac", and "mpg". For instructions on how to use the edjob utility, refer to the Firstlogic documentation.
Note:
If the jobs have already been updated to 7.50c (using the .upd file sent with the master install with edjob), you will receive the following error when trying to open a 7.50 job: Error: The "Report Defaults" block, occurrence 1, has an incorrect number of lines and the job cannot be loaded.
These jobs will need to be manually edited to change the version string in the General block from 7.50c to 7.10c. Then edjob can be rerun with the correct .upd file.
For all job templates created under the address enhancement application, manually edit the job templates using an editor such as Notepad, and comment out the "Create Output File" block by adding the '*' character at the beginning of the line. For example:
*BEGIN Create Output File

License files and remote access software

In release 6.0 and higher, Data Integrator includes a specific change to license support when activating Data Integrator components across a terminal server, such as PC Anywhere or Terminal Services Client. You must request new license files to use release 6.x or above when using a terminal server. Specifically, you require:
New Designer licenses to initiate a Designer session across a terminal server connection (when using evaluation or emergency licenses, or node-locked licenses)
New Job Server and Access Server licenses to open the Service Manager Utility across a terminal server connection (when using any type of license)
If you do not make the license change, you may see the error "Terminal Server remote client not allowed." If you require terminal services, contact licensing@businessobjects.com.
Behavior changes in version 6.5.1
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to change their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Behavior changes in version 6.5.0.1
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to change their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Web services support

When using Data Integrator with Web services, the following limitations exist.
SAP R/3 IDoc message sources in real-time jobs are not supported.
The Data Integrator Web services server uses the first Job Server in the list of those available for a batch job. The first Job Server might be a server group.
If you are using Web services and upgrading to version 6.5.0.1, Web services server (call-in) functionality that you created with previous versions of Data Integrator is not supported. Perform the following changes to manually upgrade the Web services server:
Regenerate a WSDL file
Prior to release 6.5.0.1, every batch job and real-time service configured in Data Integrator was published in the WSDL file. In release 6.5.0.1, Data Integrator only publishes batch jobs and real-time services that you select. Clients that call Data Integrator using Web services will fail until you authorize the jobs and services being invoked. Consult Data Integrator technical manuals for instructions about how to select jobs, enable security, and generate a WSDL file for a Web services server.
Change SOAP calls to match those in the 6.5.0.1 version.
A session ID is not included in SOAP messages unless you enable security for the Web services server. Prior to release 6.5.0.1, the sessionID message part was published in every SOAP message.
To fix this problem, change SOAP calls to match those in the 6.5.0.1 version of the WSDL file.
If security is off, remove the sessionID line from the SOAP input message.
If security is on, copy the sessionID from the SOAP body to the SOAP header, as a session ID now appears in the SOAP header instead of the body.
Note:
When you install 6.5.0.1, the installer automatically upgrades the Web Services Adapter and your existing Web services clients will function properly.

Sybase bulk loader library on UNIX

The installed Sybase client on UNIX platforms does not include the Sybase bulk loader dynamic shared library that Data Integrator requires. Please refer to the README in either the CDROM Location/unix/PLATFORM/AWSybase directory or the $LINK_DIR/AWSybase directory for instructions about how to:
Confirm that the library is available in the Sybase client installation
Build the library

Behavior changes in version 6.5.0.0
The following sections describe changes in the behavior of Data Integrator from previous releases. In most cases, the new version avoids changes that would cause existing applications to change their results. However, under some circumstances a change has been deemed worthwhile or unavoidable.

Browsers must support applets and have Java enabled

The Administrator and the Metadata Reporting tool require support for Java applets, which is provided by the Java Plug-in. If your computer does not have the Java Plug-in, you must download it and make sure that your security settings enable Java to run in your browser. Business Objects recommends you use Java Plug-in 1.3.1.
To find a plug-in go to: http://java.sun.com/products/plugin/downloads/index.html.

Execution of to_date and to_char functions

Prior to release 6.0, Data Integrator did not always push the to_date and to_char functions to the source database for execution when it was possible based on the data flow logic. The fix for this improved optimization requires that the data format specified in the function exactly match the format of the extracted data. The result can be a significant performance improvement in flows that use these functions.
It may be that your existing function calls do not require a change because the format already matches the data being extracted. If the formats do not match, this change is required. If your function calls do not correspond to this new format requirement, you will see an error at runtime.
Compare the examples below with your to_date or to_char function calls to determine if a change is needed.
For example, the WHERE clause of the query transform in the DF_DeltaSalesOrderStage of the Customer Order Rapid Mart (was eCache) includes the following function calls:

WHERE (to_date(to_char(SALES_ORDER_XACT.LOAD_DATE, 'YYYY.MM.DD HH24:MI:SS'), 'YYYY.MM.DD') = to_date(to_char($LOAD_DATE, 'YYYY.MM.DD HH24:MI:SS'), 'YYYY.MM.DD'))
To support this change, change the WHERE clause to the following:
WHERE (to_date(to_char(SALES_ORDER_XACT.LOAD_DATE, 'YYYY.MM.DD'), 'YYYY.MM.DD') = to_date(to_char($LOAD_DATE, 'YYYY.MM.DD'), 'YYYY.MM.DD'))
Also, in the Sales Rapid Mart in the DF_SALESOrderHistory data flow, in the mapping column, VALID_FROM is set as:
to_date(to_char($G_LOAD_DATE, 'YYYY.MM.DD HH24:MI:SS'), 'YYYY.MM.DD')
To support this change, the mapping needs to be changed to:
to_date(to_char($G_LOAD_DATE, 'YYYY.MM.DD'), 'YYYY.MM.DD')
Similarly, the SELECT list of the ConvertTimes query transform in the DF_Capacity_WorkCenter data flow of the Profitability Rapid Mart contains the following function call:

to_date((to_char($G_LOAD_DATE, 'YYYY.MM.DD') || '00:00:00'), 'YYYY.MM.DD HH.MI.SS')
Change the function call to:

to_date((to_char($G_LOAD_DATE, 'YYYY.MM.DD') || '<space>00:00:00'), 'YYYY.MM.DD HH24:MI:SS')

Changes to Designer licensing

Data Integrator Designer version 6.5 does not support floating licenses. If you are currently using floating Designer licenses, obtain a new license key before upgrading. Send your request to licensing@businessobjects.com with Data Integration License Keys as the subject line. The new key will be added to your existing License Authorization Code.
For more information, see the Online Customer Support site at http://www.techsupport.businessobjects.com. Search for the article titled "Business Objects Data Integrator 6.5: Changes to Designer License Keys Q&A."

License files and remote access software

In release 6.0 and higher, Data Integrator includes a specific change to license support when activating Data Integrator components across a terminal server, such as PC Anywhere or Terminal Services Client. You must request new license files to use release 6.x when using a terminal server. Specifically, you require:
New Designer licenses to initiate a Designer session across a terminal server connection (when using evaluation or emergency licenses, or node-locked licenses)
New Job Server and Access Server licenses to open the Service Manager Utility across a terminal server connection (when using any type of license)
If you do not make the license change, you may see the error "Terminal Server remote client not allowed." If you require terminal services, contact licensing@businessobjects.com.

Administrator Repository Login

In release 6.0 and higher, in the Data Integrator Administrator, all repository connectivity occurs through JDBC; hence, the previous connectivity information is either invalid or incomplete. When you use the Administrator, you will need to re-register all of your repositories.
Use the following connection information guidelines:
Database Type: All types
  Repository name: Any logical name.

Database Type: Oracle
  Machine name: Machine name on which the database server is running.
  Port: Port on which the database server is listening. The Administrator provides default values; if you do not know the port number, use the default.
  Service name/SID: The database instance name to connect to. Depending on whether you are using an 8.0.2 or 8i database, it is either the SID or the Service name.

Database Type: DB2
  Datasource: Datasource name as configured in the DB2 client configuration utility. This is the same as in previous Administrator or Designer connections.

Database Type: Sybase
  Database name: Database instance name.

Database Type: Microsoft SQL Server
  Database name: Database instance name.

Database Type: Informix
  Server name: The name of the machine on which the server is running.
  Database name: The name of the database instance running on the Server.

Administrator Users

Administrator user configuration information in ActaWorks version 5.x was saved in a primary repository. When you upgrade to Data Integrator version 6.x, this information is lost. You must reconfigure your Administrator users manually.

Data Quality to Data Services Migration


Overview of migration

This chapter provides information about:
migrating your Data Quality Projects into Data Services
understanding some of the benefits of using Data Services
seeing the differences between previous versions of Data Quality and Data Services
using best practices during migration
learning how to troubleshoot during migration

Who should migrate?

Anyone who is using Data Quality XI and Data Integrator as standalone applications should migrate to Data Services.
The migration utility works with these versions of software:
Data Integrator 11.7.x
Data Quality XI 11.7.x and 11.6.x
Data Integrator XI R2
Data Quality XI R2 11.5 and newer
Firstlogic IQ8 8.05c and newer

Note: Some of these versions require manual migration steps.
Those who are using the Firstlogic Data Quality Suite (Job file, RAPID, Library and/or eDataQuality) cannot use the migration utility to convert the existing projects into Data Services. The only option is to create the projects again in Data Services.

Why migrate?

You may have seen some literature that includes a comprehensive list of reasons to migrate. Here are a handful of the main reasons why you should migrate.
Performance
The new platform combines the established Data Integrator features with the improved Data Quality features in one user interface.
Data profiling of source and target
You can monitor, analyze, and report on the quality of information contained in the data marts, data warehouses, and any other data stored in databases. You can test business rules for validity and prioritize data quality issues so that investments can be made in the high impact areas.
Improved multi-user support
With Data Services, you have access to both a central repository (if purchased) for multi-user storage and a local repository for each user. Version control for the repository objects keeps you in control by labeling and comparing objects. This version includes top-notch security with authentication to access the central repository, authorization for group-based permissions to objects, and auditing for changes to each object.
Powerful debugging capabilities
You can set breakpoints, view the data before and after each transform, set filters when previewing data, and save and print the preview data.
Repository management
You can easily manage your fully relational repository across systems and have the ability to import repository objects from a file. You can also import source and target metadata for faster UI response times. With datastore configurations, you can define varying connection options for similar datastores between environments, use different relational database technology between environments without changing your jobs, and use different database owners between systems without making changes during migration to each environment. With system configurations, you can associate substitution configurations per system configuration by associating different substitution parameter values by environment.
Import and export metadata
You can import and export metadata with Common Warehouse Model (CWM) 1.0/1.1 support and ERWIN (Computer Associates) 4.x XML. You can also export through the Meta Integration Model Bridge (MIMB), if you have it installed.
Auditing
You can audit your projects with statistics collection, rule definitions, email notification, and audit reporting.
Reports and dashboards
With the reporting tool, you can view daily and historical execution results and duration statistics. With the data validation dashboard, you can view the results of validation rules, organize the validation rules across jobs into functional areas and drill down statistics to a functional area's validation rules. You can also define high-level business rules based on results of validation rules.
With impact and lineage analysis, you can understand the cost and impact of other areas when the datasource is modified. You can view, analyze and print jobs, work flows, and data flow details, view table/file usage based on the source and target, view a summary of job variables and parameters, generate PDF/Word documents on a job-by-job basis, and view a transform option and field mapping summary.

Introduction to the interface

The Data Services user interface is different from the Data Quality user interface. It has similar elements, but in a different presentation.
Note:
The window shows a project named my_gsl_proj open on the left portion of the screen. The right portion shows the GlobalSuggestions transform input and output fields and the Option groups.
In the upper left corner of the Data Services UI, you can see the Project Area. You can see your project folders and any jobs that you have. It's a hierarchical view of the objects used in each project.
Below the Project Area is the Local Object Library, where you have access to all of the reusable objects in your jobs. It is a view into your repository, so you do not need to access the repository directly. There are tabs at the bottom so that you can view projects, jobs, work flows, data flows, transforms, data sources, file formats, and functions.
The right side of the window is the workspace. The information presented here will differ based on the objects you have selected. For example, when you first open Data Services, you will see a Business Objects banner followed by Getting Started options, Resources, and Recent Projects. In the example, the workspace has the GlobalSuggestions transform open to the object editor.
The editor displays the input and output schemas for the object and the panel below lists the options for the object.
See the Data Services Designer Guide: Designer User Interface for information.

Downloading blueprints and other content objects

We have identified a number of common scenarios that you are likely to handle with Data Services. Instead of creating your own job from scratch, look through the blueprints. If you find one that is closely related to your particular business problem, you can simply use the blueprint and tweak the settings in the transforms for your specific needs.
For each scenario, we have included a blueprint that is already set up to solve the business problem in that scenario. Each blueprint contains the necessary Data Services project, jobs, data flows, file formats, sample data, template tables, and custom functions to run the data flows in your environment with only a few modifications.
You can download all of the blueprints or only the blueprints and other content that you find useful from the SAP Community Network website. Here, we periodically post new and updated blueprints, custom functions, best practices, white papers, and other Data Services content. You can refer to this site frequently for updated content and use the forums to provide us with any questions or requests you may have. We have also provided the ability for you to upload and share any content that you have developed with the rest of the Data Services development community.
Instructions for downloading and installing the content objects are also located on the Blueprints web page.
1. To access the BusinessObjects Data Services Blueprints web page, go to https://boc.sdn.sap.com/dataservices/blueprints in your web browser.
2. Open the Content Objects User's Guide to view a list of all of the available blueprints and content objects and their descriptions, and instructions for downloading and setting up the blueprints.
3. Select the blueprint that you want to download. To download all blueprints, select Data Quality Blueprints - All.
4. Follow the instructions in the user's guide to download the files to the appropriate location and make the necessary modifications in Data Services to run the blueprints.

Introduction to the migration utility

The Data Quality Migration Utility is a Windows-based command-line utility that migrates your Data Quality repository to the Data Services repository. The utility is in the LINK_DIR\DQMigration folder. It uses an XML-based configuration file.
You can set options on this utility to migrate the entire repository (recommended) or to migrate on a project-by-project or transform-by-transform basis. You can also run the utility in Analyze Mode, where it identifies errors and warnings so that you can fix them in Data Quality before fully migrating.
After running the utility, you can optionally view the Migration Report in a web browser for details of possible errors and warnings. We highly recommend you fix these before trying to run the job in Data Services.
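For example, to run the utility in analyze mode against the default configuration file (the full syntax is described later in this chapter):

dqmigration -cdqmig.xml -aYES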
In addition, if your Data Quality jobs were published as Web services, after running the utility you can publish the migrated jobs as Data Services Web services. For information on publishing jobs as Web services, see the Data Services Integrator's Guide.
Related Topics
Running the dqmigration utility on page 76
dqmigration utility syntax and options on page 79

Terminology in Data Quality and Data Services

Several terms are different between Data Quality and Data Services.
Data Quality term -> Data Services term

folder (Data Quality) -> project (Data Services): Terms are different, but it holds the project or job that runs.
project (Data Quality) -> job (Data Services): In Data Quality, a project is able to run. In Data Services, a project is a level higher: the project contains the job, and the job is able to run.
canvas (Data Quality) -> workspace (Data Services): In Data Quality, you dragged a transform onto a canvas. In Data Services, you drag a transform onto a workspace.
dataflow (Data Quality) -> data flow (Data Services): In Data Quality, a dataflow is a series of transforms hooked together, that may or may not run. In Data Services, the data flow includes everything that will extract, transform, and load data.
option explorer/option editor (Data Quality) -> object editor (Data Services): The terms are different, but you do the same things: set your options.
transactional (Data Quality) -> real time (Data Services): The terms are different, but they mean the same thing: processing one or many records at a time, usually through a web service.
reader (Data Quality) -> source (Data Services): The terms are different, but they mean the same thing: a place where the incoming data is held.
writer (Data Quality) -> target (Data Services): The terms are different, but they mean the same thing: a place where the output data is held.
substitution variables (Data Quality) -> substitution parameters (Data Services): The terms are different, but they mean the same thing: a text string alias.
In Data Quality, you had a few basic layers: a folder, a project, and a dataflow that contains a series of transforms which may or may not run. In Data Services, the top layer is called a project. The next layer is a job that runs. The job may hold a work flow, which is where you can set up conditional processing. The work flow, if you use one, will contain the data flow that contains a series of transforms.
See the Data Services Designer Guide: Designer User Interface for information.

Naming conventions

Object names
Your objects, when migrated, will have the prefix DQM_. If the name of an object is longer than 64 characters, a numeric suffix is added (for example, _001) to preserve unique names.
Object: Substitution variable configuration
  Pre-migrated Data Quality name: dqserver1_substitutions.xml
  Migrated Data Services name: DQM_dqserver1_substitutions.xml

Object: Datastores created from named_connections
  Pre-migrated Data Quality name: first_conn
  Migrated Data Services name: DQM_first_conn

Object: Datastores created from Reader/Writer settings
  Pre-migrated Data Quality name: Project "first.xml" with Reader called "My_DB_Reader"
  Migrated Data Services name: DQM_first_My_DB_Reader

Object: File formats created from named_connections
  Pre-migrated Data Quality name: CRM_usa
  Migrated Data Services name: DQM_CRM_usa

Object: File formats created from Reader/Writer settings
  Pre-migrated Data Quality name: Project "first.xml" with Reader called "Cust_Detect"
  Migrated Data Services name: DQM_first_Cust_Detect

Object: SDK transform configurations
  Pre-migrated Data Quality name: my_usa_reg_addr_cleanse
  Migrated Data Services name: DQM_my_usa_reg_addr_cleanse

Object: Data Integrator data flows and jobs
  Pre-migrated Data Quality name: first.xml
  Migrated Data Services name: DQM_first

Input and output field names
Data Quality input and output fields have a period in their names. However, Data Services does not allow a period (.) in column names. Therefore, the dqmigration utility replaces the period with an underscore (_). For example, suppose your Data Quality Reader transform has input field names input.field1, input.field2, and input.field3. After migration, these field names become input_field1, input_field2, and input_field3.


Differences between Data Quality and Data Services
While this list is not exhaustive, it lists the major differences that you will see between Data Quality and Data Services.
There are also changes to transform options and option groups. See each transform section in this document for details.
Web Services
The .Net deployment is deprecated.
Match transforms
The Aggregator, Sorter, Candidate Selector, Match or Associate, Group Statistics, Unique ID, and Best Record transforms are now included together in one Match transform. (The Sorter transform becomes part of the Match transform only when it is migrated in a specific transform order, such as Sorter, Aggregator, and then Match.) You will also notice the performance improvement in the Match and UDT transforms.
GAC transform
Support for Australia, Canada, Global Address, EMEA, Japan and USA engines. There is also a new Suggestion List option group.
URAC transform
You can only output one Suggestion Lists output field within URAC.
Search/Replace transform
The Search/Replace Transform is replaced by a Query transform using a new search_replace() function call.
Data Cleanse transform
New dictionary connection management window.
Global Suggestion Lists transform
New Suggestion List Option group.
Phonetic Key transform
The Phonetic Key transform is replaced by a query transform with either a double_metaphone() or soundex() function call.
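For example, a migrated data flow might compute a phonetic key in a Query transform with a column mapping like the following (a sketch; the table and column names are illustrative):

double_metaphone(CUSTOMER.LAST_NAME)

or, for Soundex-style keys:

soundex(CUSTOMER.LAST_NAME)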
Substitution variables and files
Substitution variables are now referred to as Substitution parameters.
Python changes
Several Python methods have been deprecated.
Deprecated objects
The improved technology in Data Services requires the deprecation of certain aspects of Data Quality.
Compound Transforms
Shared options
Candidate Selector's Unique ID record stem support
Some options and behaviors related to pre/post SQL operations in the database Reader and Writer transforms
UDT's per-dataflow mode (use a work flow and scripts as a workaround)
Disabled transforms
Note:
Disabled transforms in Data Quality projects are enabled after migration. If you do not want the transforms enabled, remove them prior to migration.
Flat Files
Binary file type support in delimited files in versions of Data Quality earlier than 11.7
Logical and packed field support in versions of Data Quality earlier than 11.7
Data collections
Thread and watermark settings on a per-transform basis
Observer transform
Progress Service transform
Integrated batch API
Admin methods in real-time API
Netscape and Firefox browsers
JIS_Encoding for flat files
Some less popular code page support (most used code pages are supported)
Several Python methods
Web Services .Net deployment
Several Web Services functions
Sun Java application server for web tier
Related Topics
Introduction to the interface on page 64
Overview of migrated transforms on page 103

Premigration checklist

To ensure a smooth migration, make sure you have completed the following tasks.
Upgrade to Data Quality XI 11.7, if possible.
Being on the most current version ensures that the latest options and functionality are properly installed and named. The options will more easily map to the Data Services options. Upgrade the repository using RepoMan.
Verify that you have permissions to both the Data Quality and Data Services repositories.
If you don't have a connection to the repositories or permissions to access the repositories, then you will not be able to migrate to Data Services.
Back up your Data Quality and Data Services repositories.
Your projects will look different in Data Services. You may want to keep a backup copy of your repository so that you can compare the original Data Quality setup with the migrated Data Services setup.
Clean your repository.
Delete any unused projects or projects that have verification errors. Projects that do not run on Data Quality XI will not migrate well and will not run on Data Services without making some changes in Data Services. Projects that have verification errors due to input or output fields will not migrate. Remove any custom transforms, compound transforms and projects that you do not use anymore from the file system repository.
Verify that your support files are accessible.
If you have a flat file reader or writer, ensure that the corresponding FMT or DMT file is in the same directory location as the flat file reader or writer.
Install Data Services.
Follow the installation instructions in the BusinessObjects Data Services XI 3.1 Installation Guide.

Using the migration tool

Overview of the migration utility

You invoke the Data Quality migration utility with the dqmigration command and specify a migration configuration file name. The utility is only available on Windows. If your repository is on UNIX, you must have shared file-system access to the repository, or FTP the repository files to a Windows system prior to running the utility. UNIX customers must run the migration from a Windows client.
The configuration file specifies the Data Quality repository to migrate from, the Data Services repository to migrate to, and processing options for the migration utility. Data Services provides a default migration configuration file named dqmig.xml in the directory LINK_DIR\DQMigration. You can either edit this file or copy it to customize it. For details, see Running the dqmigration utility on page 76.
The value of the LINK_DIR system variable is the path name of the directory in which you installed Data Services.
The dqmigration utility creates the following files:
Migration report in LINK_DIR\DQMigration\mylogpath
This migration report provides the status of each object that was migrated and displays the informational, warning, and error messages from the migration utility. After the dqmigration utility completes, it displays a prompt that asks you if you want to view the migration report. Always open the file in Internet Explorer.
Work files
The dqmigration utility creates a directory LINK_DIR\DQMigration\work_files that contains the following files. Use these files if you need to troubleshoot any errors in the migration report.
Directories configuration_rules_1 and configuration_rules_2 – These directories contain a copy of all of the intermediate XML created during migration.
.atl files – These files contain the internal language that Data Services uses to define objects.
The last step of the dqmigration utility imports the .atl files and creates the equivalent jobs, data flows, and connections in the Data Services repository.
Related Topics
Running the dqmigration utility on page 76
dqmigration utility syntax and options on page 79
Migration report on page 83

Migration checklist

During migration, follow these steps.
Complete the steps in the premigration checklist.
Run the migration utility on your entire repository (recommended) or on your project that has all verification errors fixed.
Follow the utility prompts to complete the migration steps.
Review the migration report by choosing to view the report at the end of the migration.
If you have errors or warnings that can be fixed in Data Quality 11.7, then fix them and run the utility again. The files for the repository (or project, as the case may be) will be overwritten in Data Services when the utility is rerun.
Fix any other errors or warnings in Data Services.
Follow the recommendations in each transform section to optimize performance.
Test the jobs in Data Services and compare the results with Data Quality results.
Make changes in Data Services, as appropriate.
After you have your jobs migrated to Data Services, you should set aside some time to fully analyze and test your jobs in a pre-production environment.
Related Topics
Premigration checklist on page 72

Connection information

The database connection options may be confusing, especially <DATABASE_SERVER_NAME>, which may be the name of the server or the name of the database. To set your database connection information, open the dqmig.xml file in the directory LINK_DIR\DQMigration.
Locate the <DI_REPOSITORY_OPTIONS> section.
Based on your database type, you would enter information similar to the following.
Example: DB2
<DATABASE_TYPE>DB2</DATABASE_TYPE>
<!-- Note: here the Server name is actually the database name -->
<DATABASE_SERVER_NAME>REPORTS1</DATABASE_SERVER_NAME>
<DATABASE_NAME>REPORTS1</DATABASE_NAME>
<WINDOWS_AUTHENTICATION>NO</WINDOWS_AUTHENTICATION>
Example: MS SQL
<DATABASE_TYPE>Microsoft_SQL_Server</DATABASE_TYPE>
<!-- Note: here the Server name is the actual machine name if it is the default instance. If you have a separate instance, specify something like IQ8TEST-XP2\MSSQL2000 -->
<DATABASE_SERVER_NAME>MACHINE-XP2</DATABASE_SERVER_NAME>
<DATABASE_NAME>REPORTS_MDR</DATABASE_NAME>
<WINDOWS_AUTHENTICATION>NO</WINDOWS_AUTHENTICATION>
Example: MySQL
<DATABASE_TYPE>MySQL</DATABASE_TYPE>
<!-- Note: here the Server name is actually the ODBC DSN name created by the installer -->
<DATABASE_SERVER_NAME>BusinessObjectsEIM</DATABASE_SERVER_NAME>
<DATABASE_NAME>BusinessObjectsEIM</DATABASE_NAME>
<WINDOWS_AUTHENTICATION>NO</WINDOWS_AUTHENTICATION>
Example: Oracle
<DATABASE_TYPE>Oracle</DATABASE_TYPE>
<!-- Note: here the Server name is actually the database name -->
<DATABASE_SERVER_NAME>DSORA103.COMPANY.NET</DATABASE_SERVER_NAME>
<DATABASE_NAME>DSORA103.COMPANY.NET</DATABASE_NAME>
<WINDOWS_AUTHENTICATION>NO</WINDOWS_AUTHENTICATION>
Example: Sybase
<DATABASE_TYPE>Sybase</DATABASE_TYPE>
<!-- Note: here the Server name is actually the machine name -->
<DATABASE_SERVER_NAME>sjds003</DATABASE_SERVER_NAME>
<DATABASE_NAME>test_proj</DATABASE_NAME>
<WINDOWS_AUTHENTICATION>NO</WINDOWS_AUTHENTICATION>

Running the dqmigration utility

When you run the dqmigration utility, you can specify the options in one of the following ways:
Run the dqmigration utility and specify all options on the command line.
Specify all options in a configuration file, and run the dqmigration utility with only the option that references that file on the command line.
Specify default values for the options in the configuration file, and run the dqmigration utility with some options specified on the command line.
The command line options override the values specified in the configuration file.
To run the dqmigration utility:
1. Make sure the PATH system environment variable contains %LINK_DIR%\bin. The utility will not run if it is not there.
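For example, you can check and, if necessary, extend PATH for the current command-prompt session (a sketch; adjust to your environment):

echo %PATH%
set PATH=%PATH%;%LINK_DIR%\bin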
2. Set up the configuration file that contains the options for the dqmigration utility. For example, create a customized configuration file named dqmig_repo.xml and copy the contents of the file dqmig.xml to dqmig_repo.xml.
3. Specify the information for the Data Quality repository that you want to migrate.
a. In the DQ_REPOSITORY_OPTIONS section of the configuration file,
specify the following options.
Absolute path name of your Data Quality repository configuration
file in the <CONFIGURATION_RULES_PATH> option.
Path name, relative to the absolute path name, of your Data Quality
substitution file in the <SUBSTITUTION_FILE_NAME> option.
Note:
Business Objects recommends that when you run the dqmigration utility the first time, you migrate the entire Data Quality repository instead of an individual project file to ensure that all dependent files are also migrated. Therefore, do not specify a value in the <FILE_OR_PATH> option the first time you run the utility. However, it is possible that after you migrate the entire repository and you fix errors in the resulting Data Services jobs, you might find an error that is easier to fix in Data Quality. In this case, you can run the migration utility on just the project after fixing it in Data Quality.
b. Specify the information for the Data Services repository to which you want to migrate.
For example, change the options in the dqmig_repo.xml configuration file to migrate the Data Quality repository at location D:\dqxi\11_7\repository\configuration_rules to the Data Services repository repo.
<?xml version="1.0" encoding="UTF-16"?>
<REPOSITORY_DOCUMENT FileType = "MigrationConfiguration">
<MIGRATION_OPTIONS>
<PROCESSING_OPTIONS>
<LOG_PATH>mylogpath</LOG_PATH>
<ANALYZE_ONLY_MODE>YES</ANALYZE_ONLY_MODE>
</PROCESSING_OPTIONS>
<DQ_REPOSITORY_OPTIONS>
<CONFIGURATION_RULES_PATH>D:\dqxi\11_7\repository\configuration_rules</CONFIGURATION_RULES_PATH>
<SUBSTITUTION_FILE_NAME>dqxiserver1_substitutions.xml</SUBSTITUTION_FILE_NAME>
<FILE_OR_PATH></FILE_OR_PATH>
</DQ_REPOSITORY_OPTIONS>
<DI_REPOSITORY_OPTIONS>
<DATABASE_TYPE>Microsoft_SQL_Server</DATABASE_TYPE>
<DATABASE_SERVER_NAME>my_computer</DATABASE_SERVER_NAME>
<DATABASE_NAME>repo</DATABASE_NAME>
<WINDOWS_AUTHENTICATION>NO</WINDOWS_AUTHENTICATION>
<USER_NAME>repo</USER_NAME>
<PASSWORD>repo</PASSWORD>
</DI_REPOSITORY_OPTIONS>
</MIGRATION_OPTIONS>
</REPOSITORY_DOCUMENT>
4. Open a command window and change directory to LINK_DIR\DQMigration, where the dqmigration executable is located.
5. Run the dqmigration utility in analyze mode first to determine the status of objects within the Data Quality repository.
Specify the value YES for <ANALYZE_ONLY_MODE> in the configuration file, or use the option -a when you run the dqmigration utility. For example, type the following command and options in a command window:
dqmigration -cdqmig_repo.xml -aYES
Note:
There is no space between the option -a and the value YES.
This step does not update the Data Services repository.
6. When the migration utility displays the prompt that asks if you want to view the migration report, reply y.
7. In the migration report, review any messages that have type Error.
You should migrate a production Data Quality repository where all of the projects have executed successfully. If your Data Quality repository contains unverifiable or non-runnable projects, they will likely not migrate successfully. Delete these types of projects before you run the dqmigration utility with <ANALYZE_ONLY_MODE> set to NO.
Note:
To delete a project in Data Quality, you delete its .xml file from the directory \repository\configuration_rules\projects.
If your Data Quality repository contains objects that the migration utility currently does not migrate (for example, dBase3 data sources), you need to take some actions before you migrate. For details, see Troubleshooting on page 215.
Other errors, such as a Reader or Writer that contains a connection string, can be fixed in Data Services after you run the migration utility with <ANALYZE_ONLY_MODE> set to NO.
For information about how to fix errors listed in the migration report, see Troubleshooting on page 215.
8. After you have fixed the serious errors that pertain to the Data Quality projects that you want to migrate, run the dqmigration utility with <ANALYZE_ONLY_MODE> set to NO.
This step imports the migrated objects into the Data Services repository.
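For example, using the same sample configuration file as in the analyze step (a sketch; your configuration file name may differ):
dqmigration -cdqmig_repo.xml -aNO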
Some of the Data Services jobs and data flows that result from the migration from Data Quality might require clean-up tasks before they can execute. For details, see Further cleanup on page 203.
Other migrated jobs or data flows might require changes to improve their performance. For details, see Improving performance on page 210.

dqmigration utility syntax and options

Use the dqmigration utility to migrate the contents of a Data Quality repository to a Data Services repository. You can also specify an individual project file or folder.
Note:
Business Objects recommends that when you run the utility the first time, you migrate the entire Data Quality repository instead of an individual project file, to ensure that all dependent files are also migrated. Otherwise, you would need to obtain the names of the dependent files (click View > Referenced File Explorer in the Project Architect) and run the migration utility multiple times, once for each file.
If you specify an option on the command line, it overrides the value in the configuration file.
Note:
If your configuration file is invalid, the command line options will not be processed.
The following table describes the dqmigration utility options.
Option: -h or /h or /?
XML tag in dqmig configuration file: None
Description: Prints the options available for this utility.

Option: -cmig_config_file_pathname
XML tag in dqmig configuration file: None
Description: Name of the configuration file for this utility. Default value is dqmig.xml, which is in the LINK_DIR\DQMigration directory.

Option: -lmig_log_file_path
XML tag in dqmig configuration file: <LOG_PATH>mig_log_file_path</LOG_PATH>
Description: Path for the log file that the dqmigration utility generates. Default value is migration_logs, which is in the current directory from which you run this utility.

Option: -amode
XML tag in dqmig configuration file: <ANALYZE_ONLY_MODE>mode</ANALYZE_ONLY_MODE>
Description: Analyze mode. Specify YES to analyze the Data Quality repository and provide warning and error messages for objects that will require manual steps to complete the migration, but do not update the Data Services repository. Specify NO to also update the Data Services repository with the migrated objects. Default value is NO.

Option: -ttimeout_in_seconds
XML tag in dqmig configuration file: <IMPORT_TIMEOUT>timeout_in_seconds</IMPORT_TIMEOUT>
Description: Amount of time that the migration utility waits to import one Data Quality project into the Data Services repository. If one project times out, the migration utility will continue with the next project. If a timeout occurs, the migration log indicates which object was being processed. Default value is 300 seconds.

Option: -ridq_input_repos_path
XML tag in dqmig configuration file: <CONFIGURATION_RULES_PATH>dq_input_repos_path</CONFIGURATION_RULES_PATH>
Description: Path of the Data Quality repository files. Default value is blank, but the sample has the value dq_directory\repository\configuration_rules, where dq_directory is the directory in which you installed Data Quality.

Option: -rsdq_substitution_file
XML tag in dqmig configuration file: <SUBSTITUTION_FILE_NAME>dq_substitution_file</SUBSTITUTION_FILE_NAME>
Description: Name of the substitution file to use during migration. Default value is blank, but the sample has the value dqxiserver1_substitutions.xml.

Option: -fidq_input_repos_file
XML tag in dqmig configuration file: <FILE_OR_PATH>dq_input_repos_file</FILE_OR_PATH>
Description: Optional. Name of XML file or folder to migrate. If you specify a file, only that file migrates. If you specify a folder, all the contents of the folder and its subfolders migrate. Default value is NULL, which means migrate the entire Data Quality repository.

Option: -dtds_repo_db_type
XML tag in dqmig configuration file: <DATABASE_TYPE>ds_repo_db_type</DATABASE_TYPE>
Description: Database type for the Data Services repository. Values can be one of the following: ODBC, Oracle, Microsoft_SQL_Server, Sybase, MySQL, DB2.

Option: -dsdi_repo_db_server
XML tag in dqmig configuration file: <DATABASE_SERVER_NAME>di_repo_db_server</DATABASE_SERVER_NAME>
Description: Database server name for a Data Services repository that is Microsoft SQL Server or Sybase ASE.

Option: -dndi_repo_db_name
XML tag in dqmig configuration file: <DATABASE_NAME>di_repo_db_name</DATABASE_NAME>
Description: Connection name for a Data Services repository that is Oracle. Data source name for a Data Services repository that is DB2 or MySQL. Database name for a Data Services repository that is Microsoft SQL Server or Sybase ASE.

Option: -dwwin_auth
XML tag in dqmig configuration file: <WINDOWS_AUTHENTICATION>win_auth</WINDOWS_AUTHENTICATION>
Description: Whether or not to use Windows Authentication (instead of SQL Server authentication) for Microsoft SQL Server. Values can be Yes or No. The default is No.

Option: -dudi_repo_user_name
XML tag in dqmig configuration file: <USER_NAME>di_repo_user_name</USER_NAME>
Description: User name of the user authorized to the Data Services repository.

Option: -dpdi_repo_password
XML tag in dqmig configuration file: <PASSWORD>di_repo_password</PASSWORD>
Description: Password for the user authorized to the Data Services repository.
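For example, the following sketches reuse the sample dqmig_repo.xml configuration file; the project file name my_project.xml and the timeout value are hypothetical.
Migrate a single project file:
dqmigration -cdqmig_repo.xml -fimy_project.xml
Raise the import timeout to 600 seconds:
dqmigration -cdqmig_repo.xml -t600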

Migration report

When you run the dqmigration utility, it generates a migration report in the directory specified by the <LOG_PATH> option (LINK_DIR\DQMigration\mylogpath in the earlier example).
The dqmigration utility displays the name of the migration report and a prompt that asks if you want to view it. The dqmigration log file has a name with the following format:
dqmig_log_processID_threadID_yyyymmdd_hhmmssmmm.xml
For example: dqmig_log_005340_007448_20080619_092451642.xml
The migration report consists of the following sections:
Migration Summary
Migration Settings – Lists the configuration option values used for this migration run.
General Processing Messages – Lists the status message of every .xml file processed in the Data Quality repository. Possible message types are:
Info when the .xml file was processed successfully.
Error when the .xml file was not processed successfully, and you will need to take corrective action.
Warning when the .xml file was processed and migrated, but you might need to take additional actions after migration to make the resulting data flow run successfully in Data Services. For example, a substitution variable is not migrated and you would need to enter the actual value in the Data Services object after migration.
The dqmigration utility executes in the following three stages, and this General Processing section displays a different color background for the messages that are issued by each stage.
Stage 1 — Determines the version of the source Data Quality repository. If the Data Quality repository is a version prior to 11.7, this stage migrates it to version 11.7. The background color of these stage 1 messages is beige.
Stage 2 — Migrates Global Address Cleansing and Match family transforms. The background color of these stage 2 messages is light pink.
Stage 3 — Migrates all other Data Quality objects and creates the .atl file to import into the Data Services repository. The background color of these stage 3 messages is the background color set for your Web browser.
General Objects – Lists the status of substitution files and connection files.
Transform Configurations – Lists the status of each transform configuration that was migrated.
Jobs – Lists the status of each job that was migrated.
Migration Details
This section provides detailed migration status for each of the following objects:
Each substitution file
Each named connection in the named_connections.xml file.
Each transform configuration
Each job (project)
How Data Quality repository contents migrate
This section describes how the dqmigration utility migrates the following Data Quality repository components, which consist mainly of XML files and folders:
Projects and folders
Substitution files and variables
Named connections
Data types
Content type
Related Topics
Deprecated objects on page 70

How projects and folders migrate

In Data Quality, folders contain objects, and the folder hierarchy can be multiple levels deep. After migration to Data Services, the utility prefixes each object name with DQM_, and the objects appear together in the Local Object Library. The Properties of a migrated object indicate its source Data Quality project.
Note:
Not all files are migrated. For example, the contents of the blueprints folder and the sample transforms that are automatically installed in Data Quality are not migrated. To migrate a file from such a folder, move it to a folder that is migrated.
In Data Quality, you can create three types of projects:
Batch – A batch project executes at a specific time and ends after all data in the specified source is processed. See How batch projects migrate on page 86.
Transactional – A transactional project receives data from a calling application (such as Web services or SDK calls) and processes data as it arrives. It remains active and ready for new requests. See How transactional projects migrate on page 88.
Integrated Batch – A project run using batch processing with an Integrated Batch Reader and an Integrated Batch Writer transform. This type of project can be used to pass data to and from an integrated application, including Data Integrator XI Release 2 Accelerated (11.7). See How integrated batch projects migrate on page 89.
Related Topics
How database Reader transforms migrate on page 121
How database Writer transforms migrate on page 131
How flat file Reader transforms migrate on page 142
How flat file Writer transforms migrate on page 150
How transactional Readers and Writers migrate on page 165
How batch projects migrate
The dqmigration utility migrates a Data Quality Batch project to the following two Data Services objects:
Data Flow – A Data Services "data flow" is the equivalent of a Data Quality project. Everything having to do with data, including reading sources, transforming data, and loading targets, occurs inside a data flow.
Job – A "job" is the only object you can execute in Data Services. Therefore, the migration utility creates a job to encompass the migrated data flow.
The dqmigration utility also creates additional objects, such as a datastore or file format, as sections How database Reader transforms migrate on page 121 and How flat file Reader transforms migrate on page 142 describe.
For example, suppose you have a Data Quality project named my_address_cleanse_usa.
The dqmigration utility migrates this batch project to a Data Services job named DQM_my_address_cleanse_usa which contains a data flow with the same name. The dqmigration utility adds a DQM_ prefix to your project name from Data Quality to form the Data Services job and data flow name.
The Project Area shows that the data flow DQM_my_address_cleanse_usa contains the following migrated transforms:
The Data Quality Reader transform has become the following Data Services objects:
Source object named Reader_Reader, which reads data from file format DQM_my_address_cleanse_usa_Reader_Reader. The file format is visible on the Formats tab of the Local Object Library.
Query transform named Reader_Reader_Query, which maps fields from the file format to the field names used by the rest of the data flow.
The Data Quality UsaRegAddressCleanseCass transform has become the Data Services UsaRegAddressCleanseCass_UsaRegAddressCleanse transform.
The Data Quality Writer transform has become the following Data Services objects:
Query transform named Writer_Writer_Query
Target object named Writer_Writer, which maps fields from field names used in the data flow to the file format DQM_my_address_cleanse_usa_Writer_Writer (visible in the Local Object Library).
When the dqmigration utility creates the Data Services objects, it updates each object's description to indicate its source Data Quality project and transform. The description also contains the original Data Quality description.
For example, in the Data Services Designer, the Formats tab in the Local Object Library shows the following descriptions.
To view the full description, select a name in the Format list, right-click, and select Properties.
Related Topics
How database Reader transforms migrate on page 121
How database Writer transforms migrate on page 131
How flat file Reader transforms migrate on page 142
How flat file Writer transforms migrate on page 150
How transactional projects migrate
The dqmigration utility migrates a Data Quality transactional project to a Data Services real-time job that consists of the following objects:
Initialization — Starts the real-time process. The initialization component can be a script, work flow, data flow, or a combination of objects. It runs only when a real-time service starts.
A data flow, which contains the resulting Data Services objects from the Data Quality project:
A single real-time source — XML message
A single real-time target — XML message
Clean-up — Ends the real-time process. The clean-up component (optional) can be a script, work flow, data flow, or a combination of objects. It runs only when a real-time service is shut down.
For example, suppose you have a Data Quality project named my_trans_address_suggestions_usa.xml.
The dqmigration utility migrates this transactional project my_trans_address_suggestions_usa to a Data Services real-time job named DQM_my_trans_address_suggestions_usa which contains a data flow with the same name.
Related Topics
How transactional Readers and Writers migrate on page 165
Post-migration tasks for transactional Readers and Writers on page 169
How integrated batch projects migrate
In Data Integrator XI Release 2 Accelerated (version 11.7), a Data Quality integrated batch project passed data from a job in Data Integrator, cleansed the data in Data Quality, and passed the data back to the Data Integrator job. These Data Quality projects were imported into Data Integrator 11.7 through a Data Quality datastore and the imported Data Quality project was used as a cleansing transform within a Data Integrator data flow.
The Data Quality transforms are now integrated into Data Services, and you can use them just like the regular Data Integrator transforms in a data flow. You no longer need to create a Data Quality datastore and import the integrated batch projects.
If you used imported Data Quality projects in Data Integrator 11.7, you will see them in Data Services.
Data Quality datastores are still visible, but you cannot edit them and you cannot create new ones. You can delete the Data Quality datastores.
You cannot connect to the Data Quality Server and browse the integrated batch projects.
The previously imported Data Quality projects are still visible:
under each Data Quality datastore, but you cannot drag and drop them into data flows. You can use the option View Where Used to see what existing data flows use each imported Data Quality project.
as a Data Quality transform within the data flow, and you can open this transform to view the field mappings and options.
To use your existing Data Integrator 11.7 integrated batch projects in Data Services, you must modify them in one of the following ways:
If your Data Integrator 11.7 data flow contains a small number of imported Data Quality transforms, modify your data flow in Data Services to use the new built-in Data Quality transforms.
If your Data Integrator 11.7 data flow contains a large number of Data Quality transforms, migrate the Data Quality integrated batch project and replace the resulting placeholder Readers and Writers with the appropriate sources and targets. A large number of Data Quality transforms can exist as either:
An imported Data Quality project that contains a large number of Data Quality transforms.
Multiple imported Data Quality projects that each contain a few Data Quality transforms.
Related Topics
Modifying Data Integrator 11.7 Data Quality projects on page 161
Migrating Data Quality integrated batch projects on page 159
How batch projects migrate on page 86

How connections migrate

The dqmigration utility migrates Data Quality connections to one of the following Data Services objects, depending on whether the data source is a file or a database:
Datastores when the connection is a database – "Datastores" represent connection configurations between Data Services and databases or applications.
File Formats when the connection is a flat file – A "file format" is a set of properties describing the structure of a flat file (ASCII). File formats describe the metadata structure, which can be fixed or delimited. You can use one file format to access multiple files if they all have the same metadata structure.
For a named connection, the dqmigration utility generates the name of the resulting datastore or file format as follows:
DQM_connectionname
For connections that are not named (the Reader or Writer transform provides all of the connection information instead of the file named_connections.xml), the resulting name of the datastore or file format is:
DQM_projectname_readername_Reader
DQM_projectname_writername_Writer
The following table shows the Data Services object to which each specific Driver_Name value migrates.
Table 3-2: Mapping of Data Quality Driver_Name options

Value of Data Quality connection Driver_Name option | Data Services object
FLDB_DBASE3 | Placeholder Fixed width File Format
FLDB_DELIMITED | File Format for a flat file
FLDB_FIXED_ASCII | File Format for a flat file
FLDB_STD_DB2 | Datastore
FLDB_STD_MSSQL | ODBC Datastore
FLDB_STD_MYSQL | Placeholder ODBC Datastore
FLDB_STD_ODBC | Datastore
FLDB_STD_ORACLE9 | Datastore
FLDB_STD_ORACLE10 | Datastore

Rather than not migrate the Data Quality .xml file, the dqmigration utility creates a placeholder file format or placeholder datastore that you can fix after migration in situations that include the following:
The flat file type or database type (such as dBase3) is not supported in Data Services.
A substitution variable was used in the Data_Source_Name and the file that the substitution variable refers to was not found.
For information about how to change the placeholder objects, see "Connection errors and warnings" in Troubleshooting on page 215.
The following table shows how the Data Quality connection options are mapped to the Data Services datastore options.
Note:
If the dqmigration utility cannot determine the database type from a named connection (for example, the connection name specified in the Data Quality Reader does not exist), then the utility creates an Oracle 9 datastore.
Table 3-3: Connection_Options mapping

Data Quality Connection_Options option | Data Services datastore option
Named_Connection | Datastore name
Data_Source_Name (connection string), when Driver_Name is FLDB_STD_DB2 | Data Source
Data_Source_Name (connection string), when Driver_Name is FLDB_STD_MYSQL | Data Source. Cleanup required. See Fixing invalid datastores on page 203.
Data_Source_Name (connection string), when Driver_Name is FLDB_STD_MSSQL | Data Source. Cleanup required. See Fixing invalid datastores on page 203.
Data_Source_Name (connection string), when Driver_Name is FLDB_STD_ODBC | Data Source
Data_Source_Name (connection string), when Driver_Name is FLDB_STD_ORACLE9 | Connection name
Data_Source_Name (connection string), when Driver_Name is FLDB_STD_ORACLE10 | Connection name
Driver_Name value FLDB_STD_DB2 | Database type: DB2; Database version: DB2 UDB 8.x
Driver_Name value FLDB_STD_MSSQL | Database type: ODBC
Driver_Name value FLDB_STD_MYSQL | A placeholder ODBC datastore is created. Cleanup required. See Fixing invalid datastores on page 203.
Driver_Name value FLDB_STD_ODBC | Database type: ODBC; Driver_name in ODBC Data Source Administration
Driver_Name value FLDB_STD_ORACLE9 | Database type: Oracle; Database version: Oracle 9i
Driver_Name value FLDB_STD_ORACLE10 | Database type: Oracle; Database version: Oracle 10g
Host_Name | Ignored, but the dqmigration log displays a warning.
PORT_NUMBER | Ignored, but the dqmigration log displays a warning.
User_Name | In Edit Datastore, User name
Password | In Edit Datastore, Password
For example, suppose your Data Quality projects use two named connections: a flat-file connection named namedoutff and a database connection named orcl.
The Formats tab in the local object library in the Data Services Designer shows the following:
The Data Quality named connection namedoutff is now the Data Services file format DQM_namedoutff.
Note:
A Data Quality named connection can have two contexts: Client and Server. The dqmigration utility migrates the Server named connection and ignores the Client one.
For a named connection, the description of the resulting file format refers to named_connections.xml.
Other Data Quality connections migrated to file formats in addition to the namedoutff flat file connection. These other Readers and Writers do not use named connections; therefore, the description indicates the name of the Data Quality project from which each was created.
To view the connection information, select a name in the Format list, right-click, and select Edit.
The Data Quality Driver Name FLDB_FIXED_ASCII becomes the Data Services File Format Type Fixed width.
The Data Quality connection Name namedoutff becomes the Data Services File Format Name DQM_namedoutff.
The Data Quality Data Source Name D:\dq_files\outff.txt becomes the Data Services File name(s).
The Datastores tab in the local object library in the Data Services Designer shows the following:
The Data Quality named connection orcl is now the Data Services datastore DQM_orcl. For a named connection, the description of the resulting datastore refers to named_connections.xml. Otherwise, the description indicates the name of the Data Quality project from which it was created.
Other Data Quality connections migrated to datastores in addition to the orcl database connection.
To view the connection information, select a datastore name in the datastore list, right-click, and select Edit.
The Data Quality connection Name orcl becomes the Data Services Datastore name DQM_orcl.
The Data Quality Data Source Name orcl becomes the Data Services Connection name orcl.

How substitution files and variables migrate

The dqmigration utility migrates Data Quality substitution files and variables as follows:
Each Data Quality substitution file becomes a Data Services substitution configuration in the Substitution Parameter Store. The name of the resulting substitution configuration has the following format:
DQM_substitutionfilename
Each Data Quality substitution variable becomes a Data Services substitution parameter. The name of the resulting substitution parameter is the same as the substitution variable. After migration, the substitution parameter is enclosed in square brackets ([]) when used in the data flow.
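For example (a sketch; the file name us_settings.xml and the variable $$Output_Path are hypothetical):
The substitution file us_settings.xml becomes the substitution configuration DQM_us_settings.
The variable $$Output_Path becomes the substitution parameter $$Output_Path, and a migrated data flow references it as [$$Output_Path].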
After migration, you need to take additional steps to ensure that the migrated substitution parameters use the correct values. For details, see Setting substitution parameter configuration on page 208.
Note:
In Data Quality, the substitution variables are case-sensitive. However, in Data Services, the substitution parameters are case-insensitive. Therefore, if two substitution variables exist with the same name but different casing, only the first substitution variable migrates and the dqmigration utility issues a warning message in the migration report.
Related Topics
Name differences with substitution parameters for reference files on page 99
How built-in substitution variables migrate on page 98
Example of how substitution files and variables migrate on page 101
How built-in substitution variables migrate
The following table shows that $$Dataflow_Name is the only Data Quality 11.7 built-in substitution variable that migrates to Data Services. The dqmigration utility replaces all instances of $$Dataflow_Name with the Data Quality project name.
Data Quality built-in substitution variable | Migrates to Data Services
$$Dataflow_Name | Yes
$$Dataflow_XML_Path | No
$$Log_Path | No
$$Report_Path | No
$$Stat_Path | No
$$Time_Stamp | No
$$Dataflow_ID | No
$$Runtime_Metadata_Path | No
$$Repos_Path | No
The dqmigration utility leaves the unsupported built-in substitution variables as they are in the migrated data flow and issues warning messages in the Migration Report. You must change these substitution variables to the actual values in the Data Services data flow. If you do not change them, Data Services produces run-time errors.
Note:
If any of these built-in substitution variables are in places that the ATL syntax does not allow substitution variables, the import of the generated .atl file will fail. You must take one of the following actions before you migrate the project:
Delete or change these variables to the actual values.
Remove the references to these variables.
When $$Dataflow_Name is used in a flat file Reader transform, the dqmigration utility does the following:
Replaces the substitution variable with the actual value.
Locates the .fmt or .dmt file with the data flow name.
Creates a corresponding file format.
However, $$Dataflow_Name is no longer a substitution variable. Therefore, if the name of the data flow changes, you must change the actual value to avoid unexpected results.
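For example, the following sketch shows the effect of the replacement; the option name <FILE_NAME> and the project name my_cleanse_project are hypothetical illustrations, not actual Data Quality syntax:
<!-- Before migration (hypothetical Reader option referencing the built-in variable): -->
<FILE_NAME>$$Dataflow_Name.fmt</FILE_NAME>
<!-- After migration, the utility substitutes the literal project name: -->
<FILE_NAME>my_cleanse_project.fmt</FILE_NAME>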
Name differences with substitution parameters for reference files
The installation of Data Services creates a substitution configuration that includes substitution parameters for Data Quality reference files used by various content transforms. Most of the Data Quality substitution variables have been replaced by a single Data Services substitution parameter. These substitution parameter settings are now part of the base configurations for various content transforms.
The Data Services names of these substitution parameters differ from the names of the corresponding substitution variables in Data Quality as the following table shows.
Table 3-5: Name differences for substitution parameters for reference files

Data Quality substitution variable name | Data Services substitution parameter name
$$DIR_PATH_GLOBAL_ADDRESS | $$RefFilesAddressCleanse
$$DIR_PATH_AUSTRALIA | $$RefFilesAddressCleanse
$$DIR_PATH_CANADA | $$RefFilesAddressCleanse
$$DIR_PATH_JAPAN | $$RefFilesAddressCleanse
$$DIR_PATH_MULTI_COUNTRY | $$RefFilesAddressCleanse
$$DIR_PATH_USA | $$RefFilesAddressCleanse
$$DIR_PATH_USA_DPV | $$RefFilesAddressCleanse
$$DIR_PATH_USA_ELOT | $$RefFilesAddressCleanse
$$DIR_PATH_USA_EWS | $$RefFilesAddressCleanse
$$DIR_PATH_USA_GEO | $$RefFilesAddressCleanse
$$DIR_PATH_USA_LACS | $$RefFilesAddressCleanse
$$DIR_PATH_USA_RDI | $$RefFilesAddressCleanse
$$DIR_PATH_USA_Z4CHANGE | $$RefFilesAddressCleanse
Part of $$DCT_PATH | $$RefFilesDataCleanse
Did not exist in Data Quality | $$ReportsAddressCleanse
Did not exist in Data Quality | $$ReportsMatch
Did not exist in Data Quality | $$SamplesInstall