SAP BODS Online Training


SAP BODS Online Training Course Details

What is BODS? SAP BusinessObjects Data Services (BODS) is an ETL tool, now owned by SAP, used to integrate all types of disparate systems: it extracts data from them, transforms that data into meaningful information, and loads it into all types of target systems.

It is tightly integrated with SAP systems, which makes it a very good tool for migrating data from legacy systems to SAP systems with less development effort, along with effective debugging and monitoring capabilities.
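The extract, transform, and load pattern described above can be illustrated with a minimal, tool-agnostic sketch in plain Python; the source rows and field names here are invented stand-ins for a legacy system, not anything from BODS itself:

```python
# Minimal ETL sketch: extract rows, transform them, load them to a target.
# The source data and field names are illustrative only.

def extract():
    """Pretend to read raw rows from a legacy system."""
    return [
        {"id": "1", "name": " alice ", "amount": "100.50"},
        {"id": "2", "name": "BOB",     "amount": "75.25"},
    ]

def transform(rows):
    """Clean and convert each raw row into typed, meaningful data."""
    return [
        {"id": int(r["id"]),
         "name": r["name"].strip().title(),
         "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows, target):
    """Append transformed rows to a target store (a list here)."""
    target.extend(rows)

target = []
load(transform(extract()), target)
```

In BODS the same three stages are modeled graphically as source objects, transforms, and target objects inside a data flow, rather than hand-written as above.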

SAP BODS Online Training Course Content

Introduction to Data Services Architecture
– Data Services Designer
– Data Services repository
– Data Services Job Server
– Data Services engine
– Data Services Access Server
– Data Services Address Server
– Data Services Administrator
– Data Services Metadata Reports applications
– Data Services Service
– Data Services SNMP Agent
– Data Services Adapter SDK
– Data Services SAP Rapid Marts

Preparing to Install Data Services Client/Server Components
– Pre-installation overview
– Installation scenarios

Repository Creation
– Repository database requirements and preparation
– Creating a Data Services repository and Selecting a repository
– Central versus local repository creation with live examples
– Using the Repository Manager
– Multi-user Environment Setup
– Activating a central repository
– Implementing Central Repository Security
– Data Services and multiple users
– Security and the central repository
– Version Checking
– Adding objects to the central repository
– Checking out objects

Logging into the Designer
– Project area
– Tool palette
– Workspace
– Local object library
– Object editors and Working with objects

About Projects and Jobs
– Executing Jobs
– Overview of Data Services job execution
– Preparing for job execution
– Monitoring Jobs

Datastore creation and Overview
– Datastores and data flows: what is a data flow?
– Datastore and system configurations
– Multi-user Development
– Creating and managing multiple datastore configurations
– File formats: what are file formats?
– File format editor, creating file formats, and editing file formats

Configure a Job Server
– Changing Job Server options
– Configure an Access Server
– Configuring Metadata Integrator
– Selecting a web application server

Using the Server Manager
– Performing a scripted installation
– Logging in to the Management Console
– Connecting the Data Profiler
– Troubleshooting installation problems
– Running Data Services components in multi-user mode
– Publishing Data Services

Transformations and Usage in Data Services
– Descriptions of transforms
– Query transforms overview
– Data Quality transforms overview
– Lookup tables and the lookup_ext function
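The lookup_ext function listed above retrieves values from a lookup table based on matching conditions, with a default for non-matching rows. Its basic effect resembles this plain-Python sketch; the table, columns, and default value are invented for illustration:

```python
# Sketch of what a lookup does: enrich each fact row with a value
# from a lookup table keyed on a join column. All names are illustrative.

country_lookup = {"US": "United States", "DE": "Germany"}  # code -> name

def lookup(table, key, default=None):
    """Return the matching value, or a default when no row matches
    (lookup_ext similarly lets you specify a default for no-match)."""
    return table.get(key, default)

orders = [{"order_id": 1, "country_code": "US"},
          {"order_id": 2, "country_code": "FR"}]

for o in orders:
    o["country_name"] = lookup(country_lookup, o["country_code"], "UNKNOWN")
```

In Data Services, the same enrichment is typically done inside a Query transform by calling lookup_ext against a datastore table rather than an in-memory dictionary.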

Data flow execution
– Creating and defining data flows
– Calling data flows to perform data movement operations
– Defining the conditions appropriate to run data flows
– Pass parameters to and from data flows

Work Flows: What Is a Work Flow
– How to create a work flow
– Steps in a work flow and Order of execution in work flows

Creating real-time jobs
– Real-time source and target objects
– Testing real-time jobs

Overview of variables and parameters
– How to create Variables and Parameters
– Using local variables and parameters, and about global variables
– Local and global variable rules

Overview of data quality
– Address Cleanse transformation overview
– Data Cleanse
– Match
– Design and Debug
– Using View Data to determine data quality
– Using the Validation transform

Understanding changed-data capture
– Using CDC with Oracle sources
– Using CDC with sources and targets
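One simple changed-data-capture strategy is snapshot comparison: diff the current source extract against the previous one to classify rows as inserted, updated, or deleted. (Oracle-based CDC in Data Services reads database change logs instead; this sketch, with invented keys and rows, only illustrates the idea of delta detection.)

```python
# CDC by snapshot comparison: classify rows by primary key.
# Keys and row contents are illustrative only.

def diff(old, new):
    """Compare two snapshots keyed by primary key and return
    (inserts, updates, deletes) as dictionaries."""
    inserts = {k: v for k, v in new.items() if k not in old}
    deletes = {k: v for k, v in old.items() if k not in new}
    updates = {k: v for k, v in new.items() if k in old and old[k] != v}
    return inserts, updates, deletes

old = {1: {"name": "alice"}, 2: {"name": "bob"}}
new = {1: {"name": "alice"}, 2: {"name": "bobby"}, 3: {"name": "carol"}}
ins, upd, dele = diff(old, new)
```

Log-based CDC avoids the cost of re-reading and comparing full snapshots, which is why it is preferred for large source tables.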

Data Services Management Console: Administrator
– Scheduling, monitoring, and executing batch jobs
– Connecting repositories to the Administrator
– Configuring, starting, and stopping real-time services
– Configuring Job Server, Access Server, and repository usage
– Configuring and managing adapters
– Managing users
– Publishing batch jobs and real-time services via Web services

Functions and Procedures
– About functions
– Descriptions of built-in functions

Raising Exceptions by Using Try/Catch Blocks
– Catch error functions and other function calls
– Nested try/catch blocks
– If statements to perform different actions for different exceptions
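The pattern above, catching different error categories and acting on each, plus a nested try/catch inside a handler, can be sketched in plain Python; the step functions and messages are invented, and Data Services expresses the same idea with try/catch objects around work-flow steps:

```python
# Sketch: different actions for different exception types,
# with a nested try block inside one of the handlers.

def run_step(step):
    """Run a job step, handling different error types differently."""
    try:
        return step()
    except ValueError as exc:
        # Data error: record it and continue (like a catch that logs).
        return ("data_error", str(exc))
    except OSError:
        # Environment error: attempt a recovery action in a nested try.
        try:
            return ("recovered", "used fallback source")
        except Exception as exc:
            return ("failed", str(exc))

def bad_data():
    raise ValueError("not a number")

def missing_file():
    raise OSError("file not found")
```

A step that raises no exception simply returns its result, so the handlers add no overhead to the success path.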

Job Scheduling Using Scripting and How to Use Scripts in BODS
– Data Services Scripting Language
– Python
– Python in Data Services

Batch Jobs
– Executing batch jobs, Scheduling jobs and Monitoring batch jobs

Using the Data Profiler
– Defining the profiler repository
– Column level profiling
– Detail profiling

Recovery Mechanisms
– Recovering from unsuccessful job execution
– Automatically recovering jobs
– Manually recovering jobs using status tables
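Recovering a job with a status table generally means recording each completed step and, on restart, skipping steps already marked done. A sketch of that idea, with an in-memory dictionary and invented step names standing in for a real status table:

```python
# Sketch of recovery via a status table: each step records its status,
# and a restarted job skips steps already marked complete.
# The step names and the status store are illustrative only.

status_table = {}  # step name -> "done" (a real job would use a DB table)
executed = []      # trace of steps actually run in this invocation

def run_job(steps):
    for name, fn in steps:
        if status_table.get(name) == "done":
            continue                 # already completed in a prior run
        fn()
        executed.append(name)
        status_table[name] = "done"  # record progress for recovery

steps = [("extract", lambda: None),
         ("transform", lambda: None),
         ("load", lambda: None)]

status_table["extract"] = "done"     # pretend a prior run finished extract
run_job(steps)
```

For this to be safe, each step must be idempotent or the status update must be written in the same transaction as the step's own work; otherwise a crash between the two leaves the status table out of sync.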