Business Objects products in this release may contain redistributions of software
licensed from third-party contributors. Some of these individual components may
also be available under alternative licenses. A partial listing of third-party
contributors that have requested or permitted acknowledgments, as well as required
notices, can be found at: http://www.businessobjects.com/thirdparty
2008-03-12
Contents
Chapter 1: Introduction..........................................................................5
Audience and assumptions..........................................................................6
This user guide tells you how to use the BusinessObjects™ Data Services
Adapter for Salesforce.com interface to integrate Salesforce.com version
6.0 with Data Services.
The Data Services Adapter for Salesforce.com interface allows you to create
a datastore that connects to the Salesforce.com web service and retrieves
data using Data Services data flows.
Audience and assumptions
This user guide assumes the following:
•You understand how to use the most current version of Data Services to
design and run batch and real-time data flows and to administer Data
Services processes. (You administer adapters from the Administrator.)
•You have a working knowledge of Salesforce.com version 6.0.
•You know what an adapter is and the role it plays in business systems
integration.
•You are familiar with how to use SQL query statements.
•You understand Changed Data Capture concepts.
•You are familiar with object-oriented modeling and can work with an
object-oriented XML configuration file.
•Because you will integrate Data Services, the Data Services Adapter for
Salesforce.com, and Salesforce.com version 6.0, familiarity with systems
administration and systems integration issues is recommended.
The Data Services Adapter for Salesforce.com allows you to access
Salesforce.com data from within the native Data Services extraction,
transformation and loading (ETL) environment. The Data Services Adapter
for Salesforce.com interface allows you to quickly and easily take advantage
of Salesforce.com version 6.0 by:
•Supporting a fully automated process for Salesforce.com configuration
•Allowing you to browse Salesforce.com schema metadata in the same
manner as all sources and targets from within the Data Services Designer
interface
Installing the Adapter for Salesforce.com
The Data Services Adapter for Salesforce.com is automatically installed
when you install Data Services version 12.0.0 or later. The adapter is
associated with several files, including:
•Adapter jar files
•Adapter configuration templates
•Salesforce.com Software System extensions
•User's Guide for Adapter for Salesforce.com (this document)
Requirements
The Job Server you associate with adapters must be configured to manage
adapters. (For more information, see Deployment overview on page 10. For
general Job Server installation and configuration information, see the Data Services Getting Started Guide.)
Note: For information about Salesforce.com, visit the Salesforce.com Web site.
This section explains the actions required to deploy the Data Services Adapter
for Salesforce.com interface. Tasks are sequenced in logical order of
performance. However, you may need to modify the sequence based on
your environment.
Deployment overview
All Data Services adapters communicate with Data Services through a
designated Adapter Manager Job Server. An adapter must be installed on
the same computer as this Job Server before you can integrate the adapter
with Data Services using the Administrator and Designer. After the adapter
is installed:
1. Use the Data Services Server Manager utility to configure adapter
connections with the Adapter Manager Job Server. For details, see the
"Configuring Job Servers" section in the Data Services Getting Started
Guide as well as the "Adapter considerations" section in the Data Services
Administrator's Guide.
2. From the Data Services Administrator:
•Configure an adapter instance on page 11.
•Start and stop the adapter instance on page 11.
3. From the Data Services Designer:
•Create the datastore on page 11 in the object library. The datastore
and adapter make it possible for you to import metadata from
Salesforce.com version 6.0 into Data Services.
•Browse and import metadata on page 12 through the datastore. Use
metadata accessed through the adapter to create batch and/or
real-time jobs. For details, see the "Adapter datastores" section of the
Data Services Designer Guide.
•Design flows on page 14 that move Salesforce.com data through the
applications you design using Data Services.
•Run applications on page 16 to finalize the integration process
(includes troubleshooting and parameter adjustments).
Integrate Data Services with Salesforce.com by combining an instance of
the Adapter for Salesforce.com with a Data Services data flow created in
the Designer. To use an adapter instance, you must first configure it as
described in this section. You can configure one or more adapter instances.
Configure an adapter instance
Use the Administrator to add an Adapter for Salesforce.com to the Data
Services system and to edit existing adapter configurations. Until you add
the adapter in the Administrator, you cannot run jobs using information from
that adapter.
Note: Before you add an adapter in the Administrator, you must first establish
an Administrator connection to your adapter-enabled repository. For general
information on connecting repositories to the Administrator, refer to the
Administrator Management section of the Data Services Administrator Guide.
Start and stop the adapter instance
Click the Status tab to view the status of all adapter instances you configured.
From this tab, you can Start adapter instances and Shutdown or Abort
instances that are running.
From the Status tab, you can also navigate to view Adapter Instance
configuration details, Log Files, and Dependent Objects for each configured
adapter instance.
Create the datastore
To associate the Data Services Adapter for Salesforce.com with data flows,
you must create an adapter datastore in the Data Services Designer. For
general information on creating an adapter datastore, refer to the Datastores
section of the Data Services Designer Guide.
Working with Salesforce.com metadata
The Salesforce.com adapter supports only tables (not function calls,
documents, and so on).
Browse and import metadata
For general information on how to browse and import metadata using a Data
Services datastore, see the "Datastores" section of the Data Services Designer Guide.
The DI_PICKLIST_VALUES table
The Salesforce.com adapter includes a Data Services proprietary table you
can import like any other Salesforce.com table. This table contains all
Salesforce.com picklists (a set of enumerated values from which to select).
To use the DI_PICKLIST_VALUES table as a source in data flows, import
the DI_PICKLIST_VALUES just like you would any other table, then
drag-and-drop it as a source in your data flow. Connect to a Query transform
and drill down to add a WHERE clause and filter the values you require.
Columns defined for this table include:
OBJECT_NAME, FIELD_NAME, VALUE, IS_DEFAULT_VALUE, IS_ACTIVE,
and LABEL.
Note: If you have translated picklist values in Salesforce.com, the LABEL
column returns values for the language specified in your personal information
settings. If picklist values are not translated, the VALUE and LABEL columns
return the same values.
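As a rough illustration of the filtering step described above, the following Python sketch mimics what a Query transform's WHERE clause does against DI_PICKLIST_VALUES rows. This is not adapter code; the sample rows and the picklist_values helper are hypothetical, but the column names come from the table definition above.

```python
# Hypothetical DI_PICKLIST_VALUES rows (columns per the table definition above).
rows = [
    {"OBJECT_NAME": "Case", "FIELD_NAME": "Status", "VALUE": "New",
     "IS_DEFAULT_VALUE": True, "IS_ACTIVE": True, "LABEL": "New"},
    {"OBJECT_NAME": "Case", "FIELD_NAME": "Status", "VALUE": "Closed",
     "IS_DEFAULT_VALUE": False, "IS_ACTIVE": True, "LABEL": "Closed"},
    {"OBJECT_NAME": "Lead", "FIELD_NAME": "Source", "VALUE": "Web",
     "IS_DEFAULT_VALUE": False, "IS_ACTIVE": True, "LABEL": "Web"},
]

def picklist_values(rows, object_name, field_name):
    """WHERE OBJECT_NAME = :object_name AND FIELD_NAME = :field_name."""
    return [r["VALUE"] for r in rows
            if r["OBJECT_NAME"] == object_name and r["FIELD_NAME"] == field_name]

print(picklist_values(rows, "Case", "Status"))  # ['New', 'Closed']
```

In a real data flow, the equivalent filter is a WHERE clause you add in the Query transform after dragging DI_PICKLIST_VALUES in as a source.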
Open and delete imported metadata
You can open imported metadata to view input and output schemas. To open
an imported table, double-click its icon. To find the icon go to the adapter
datastore in the object library and open Tables.
From the Data Services Designer, you can also delete imported metadata
by right-clicking an imported object and selecting Delete from the menu.
After you import metadata, it is available for use in Data Services data flows.
Metadata mapping
Salesforce.com data types map to Data Services data types as follows:
Salesforce Datatype   Description                      Data Services Datatype
xsd:base64Binary      Base 64-encoded binary data      varchar
xsd:boolean           Boolean (True/False) values      varchar ('true' or 'false')
xsd:datetime          Date/time values (timestamps)    datetime
xsd:date              Date values                      date
xsd:double            Double values                    decimal
xsd:int               Integer values                   int
xsd:string            Character strings                varchar

The date/time values that the Salesforce.com adapter retrieves from
Salesforce.com are all in ISO 8601 format, reflect GMT time, and include a
time zone field. To adjust for any time zone differences, the Salesforce.com
adapter automatically performs a translation based on the associated local
and server clocks. When the Salesforce.com adapter communicates datetime
information to Data Services, Data Services receives those values in local
time and the time zone field is not considered.
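For quick reference, the mapping above can be written as a simple lookup. This is an illustrative sketch only (the SALESFORCE_TO_DS name is hypothetical, not part of the adapter):

```python
# Salesforce.com XML Schema types -> Data Services data types,
# transcribed from the mapping table above.
SALESFORCE_TO_DS = {
    "xsd:base64Binary": "varchar",   # Base 64-encoded binary data
    "xsd:boolean": "varchar",        # holds 'true' or 'false'
    "xsd:datetime": "datetime",      # timestamps, translated from GMT
    "xsd:date": "date",
    "xsd:double": "decimal",
    "xsd:int": "int",
    "xsd:string": "varchar",
}

print(SALESFORCE_TO_DS["xsd:double"])  # decimal
```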
Note: If your local and server clocks are not synchronized, translation speed
is unaffected. However, if your local clock is not set to the correct time, Data
Services may send incorrect times to Salesforce.com and changes that you
expected to be returned may not be returned until a later synchronization.
Examples:
•If you are in Pacific Standard Time (PST) and the adapter receives
'2005-08-10T23:00:00Z' (where 'Z' denotes GMT time) from Salesforce.com,
the value sent to Data Services will be '2005.08.10 15:00:00'.
•You want to retrieve information that has changed since yesterday at 6:00
PM local time. You write a condition stating: SFDC_TIMESTAMP
>='2005.08.10 18:00:00' and Data Services sends this condition "as is"
to the adapter. Because Salesforce.com will not understand this timestamp
(it lacks a time zone indicator), the Salesforce.com adapter automatically
converts the time specified in Data Services to a format that
Salesforce.com understands, formatting the value to
'2005-08-11T01:00:00Z'.
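The two translations above can be sketched in Python as follows. This is a minimal illustration, not adapter code: the function names are hypothetical, and fixed UTC offsets are assumed (-8 matches the first example's PST value; the second example's result implies a UTC-7, daylight-time offset), whereas the adapter itself derives the offset from the local and server clocks.

```python
from datetime import datetime, timedelta, timezone

def gmt_to_local(iso_gmt, offset_hours):
    # Parse the ISO 8601 value from Salesforce.com ('Z' = GMT).
    utc = datetime.strptime(iso_gmt, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    local = utc.astimezone(timezone(timedelta(hours=offset_hours)))
    return local.strftime("%Y.%m.%d %H:%M:%S")  # style Data Services receives

def local_to_gmt(local_ts, offset_hours):
    # Attach the local offset, then convert to the GMT format
    # Salesforce.com understands.
    naive = datetime.strptime(local_ts, "%Y.%m.%d %H:%M:%S")
    aware = naive.replace(tzinfo=timezone(timedelta(hours=offset_hours)))
    return aware.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

print(gmt_to_local("2005-08-10T23:00:00Z", -8))  # 2005.08.10 15:00:00
print(local_to_gmt("2005.08.10 18:00:00", -7))   # 2005-08-11T01:00:00Z
```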
CDC datastore tables and generated columns
The CDC table nodes differ from normal table nodes. If you expand a CDC
table node, you will see only a Columns folder that contains the same columns
as the original table, plus three generated columns that are used for CDC
data retrieval:
•DI_SEQUENCE_NUMBER: the Data Services sequence number (int)
•DI_OPERATION_TYPE: the Data Services operation type (varchar)
•SFDC_TIMESTAMP: the Salesforce.com timestamp (datetime)
Design flows
After importing metadata as datastore objects in the Data Services Designer,
you can use that metadata when designing data flows.
(For general application design and administration information, see the Data Services Designer Guide and the Data Services Administrator Guide.)
Changed data and Salesforce.com
One simple usage of the Salesforce.com tables is to read changed data.
The following example explains one way you can schedule Data Services
to query Salesforce.com for changed data after loading Salesforce.com
tables into your local repository.
Using check-points
If you can replicate an object, Salesforce.com allows applications to retrieve
the changed data for that object. Salesforce.com saves changed data for a
limited amount of time (for details, see your Salesforce.com technical
documentation). Salesforce.com monitors neither the retrieving application
nor the data retrieved.
When you enable check-points, a CDC job in Data Services uses the
subscription name to read the most recent set of appended rows and to mark
the end of the read (using the SFDC_TIMESTAMP of the last record). If you
disable check-points, the CDC job always reads all the rows in the CDC data
source, which increases processing time.
To use check-points, enter the CDC Subscription name on the Source Table
Editor and select the Enable check-point option. If you enable check-points
and run a CDC job in recovery mode, the recovered job resumes reading the
CDC data source at the last check-point.
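The check-point behavior described above can be sketched as follows. This is a hypothetical reader, not the adapter's implementation: with check-points enabled, each run reads only rows appended after the saved SFDC_TIMESTAMP and then advances the mark; with them disabled, every run rereads the whole source.

```python
checkpoints = {}  # subscription name -> SFDC_TIMESTAMP of last record read

def read_cdc(rows, subscription, enable_checkpoint):
    # rows are assumed sorted ascending by SFDC_TIMESTAMP.
    since = checkpoints.get(subscription) if enable_checkpoint else None
    batch = [r for r in rows if since is None or r["SFDC_TIMESTAMP"] > since]
    if enable_checkpoint and batch:
        # Mark the end of the read for the next run of this subscription.
        checkpoints[subscription] = batch[-1]["SFDC_TIMESTAMP"]
    return batch

rows = [{"Id": i, "SFDC_TIMESTAMP": t}
        for i, t in enumerate(["2005-08-10", "2005-08-11", "2005-08-12"])]
print(len(read_cdc(rows, "sub1", True)))   # 3 (first run reads everything)
print(len(read_cdc(rows, "sub1", True)))   # 0 (nothing new past the check-point)
print(len(read_cdc(rows, "sub1", False)))  # 3 (check-points disabled: reread all)
```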
Note: To avoid data corruption problems, do not reuse data flows that use
CDC datastores because each time a source table extracts data it uses the
same subscription name. This means that identical jobs, depending upon
when they run, can get different results and leave check-points in different
locations in the file.
Using the CDC table source default start date
The CDC table source default start date depends on several factors. This
date can be a value you specify, a check-point value, or a date related to
the Salesforce.com retention period.
When you do not specify a value for the start date:
•Data Services uses the beginning of the Salesforce.com retention period
as the start date if a check-point is not available (during initial execution).
•Data Services uses the check-point as the start date if a check-point is
available and occurs within the Salesforce.com retention period. If the
check-point occurs before the retention period, Data Services uses the
beginning of retention period as the start date.
When you specify a start date value, if your date occurs:
•Within the Salesforce.com retention period and no check-point is available,
Data Services uses your specified value.
•Within the Salesforce.com retention period and after the check-point,
Data Services uses your specified value.
•Within the Salesforce.com retention period and before the check-point,
Data Services uses the check-point value as the start date.
•Outside of the Salesforce.com retention period, the Salesforce.com
adapter ignores the value.
Limitations
•If a table is created within the Salesforce.com retention period and a
check-point is not available, the execution returns an error message. To
work around this problem, drill into the source object and enter a CDC
table source default start date that occurs after the date the table was
created.
•The table comparison and SQL transforms and the lookup and lookup_ext
functions cannot be used with a source table imported with a CDC datastore
because of the existence of the Data Services generated columns. You
cannot compare or search these columns.
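The start-date rules above can be condensed into a small decision function. This is an illustrative sketch only (the resolve_start_date name is hypothetical, and dates are compared as ISO strings for simplicity):

```python
def resolve_start_date(specified, checkpoint, retention_start):
    """Effective CDC start date per the rules above (ISO date strings)."""
    if specified is None:
        if checkpoint is None:
            return retention_start          # initial execution
        # Check-point before the retention period falls back to its start.
        return max(checkpoint, retention_start)
    if specified < retention_start:
        # Outside the retention period: the adapter ignores the value.
        return checkpoint if checkpoint else retention_start
    if checkpoint is not None and specified < checkpoint:
        return checkpoint                   # specified before the check-point
    return specified                        # no check-point, or after it

print(resolve_start_date(None, None, "2005-07-15"))                  # 2005-07-15
print(resolve_start_date("2005-08-01", None, "2005-07-15"))          # 2005-08-01
print(resolve_start_date("2005-07-20", "2005-08-01", "2005-07-15"))  # 2005-08-01
```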
Run applications
After you design your application(s), you must run them to finalize Data
Services-Salesforce.com integration. These are the basic startup tasks:
•In the Data Services Administrator, start each application to be used in
the integration.
Real-time: Start services and applications that use this service.
Batch: Start/schedule the Data Services job.
•In the Administrator, monitor progress for each job. You can monitor
pending requests, processed requests, failed requests, and status.
Note: The Administrator does not automatically refresh views. To see the
latest status, refresh the view manually.
•In the Administrator, monitor progress for each (real-time) service.
•On the Salesforce.com Server, monitor messaging progress for the
configured queues.
If problems occur:
•For error message descriptions and suggested troubleshooting actions,
see Understanding error messages on page 17.
•To understand the source of a problem, use Data Services error and log
tracing (for details, see Data Services documentation).
•To enable debug tracing for the adapter instance, use the Administrator.
Understanding error messages
During the course of designing and deploying your jobs, you may encounter error
messages. Find error messages and their descriptions (including suggested
actions) listed in the following table:
Error Message: Login operation has failed. SForce.com message is {0}
Description: Invalid user name/password, or the user account is blocked for
another reason, which is explained by the Salesforce.com message.
ACTION: Confirm the password or contact Salesforce.com for more information.

Error Message: Unknown object type. SForce.com message is {0}
Description: The table used in the query is no longer available or visible
to the user.
ACTION: Browse Salesforce.com metadata and look for the table.

Error Message: Invalid field. SForce.com message is {0}
Description: One or more fields used in the query are no longer available.
ACTION: Browse Salesforce.com metadata to determine if there is a difference
between the imported table and the actual metadata. If necessary, rebuild
your data flow.

Error Message: Unsupported SQL statement: {0}
Description: Your data flow is not supported by Salesforce.com.
ACTION: Rebuild according to the restrictions described in this document.

Error Message: Malformed query: {0}. SForce.com message is {1}
Description: The submitted query is unsupported by Salesforce.com. Most
likely you have encountered a bug translating between Data Services data
flows and Salesforce.com queries.
ACTION: Contact product support.

Error Message: Invalid session parameter: name = {0}, value = {1}
Description: The URL or batchSize session parameter is invalid. Either the
URL is malformed or batchSize is not a positive integer.
ACTION: Check the integrity of the URL and confirm that the batchSize is a
positive integer.

Error Message: Invalid CDC query: {0}
Description: The data flow built over a CDC table is invalid.
ACTION: Check for (and fix) any missing WHERE clause condition for
SFDC_TIMESTAMP.

Error Message: There was a service connection error when talking to
SForce.com: {0}