Informatica

>Data Engineering

Data Engineering Developer

Informatica Developer Tool for Big Data Developers (Instructor Led or onDemand)
DEI for Developers (Instructor Led or onDemand)
Big Data Streaming for Developers (Instructor Led)
Big Data Developer (Specialist Certification)

Data Engineering Administrator

Informatica Developer Tool for Big Data Developers (Instructor Led or onDemand)
DEI for Developers (Instructor Led or onDemand)
Data Engineering Integration: Administration (Instructor Led)

>>Informatica Developer Tool for Big Data Developers

now.informatica.com/Informatica-Developer-Tool-for-Big-Data-Developers-onDemand.html

Course Overview

This course is applicable to software version 10. Learn the mechanics of Data Integration using the Informatica Developer Tool for Big Data Development. This course takes you through the key components to develop, configure, and deploy data integration mappings. 

Objectives

After successfully completing this course, students should be able to:

  • Extract data from relational and flat file sources
  • Develop commonly used transformations for mappings
  • Automate metadata changes through dynamic mappings
  • Parameterize mappings 
  • Assemble mappings into workflows
  • Deploy mappings and workflows as applications

Target Audience

  • Developer

Prerequisites

  • None

Agenda

Module 1: Fundamentals

  • Introduction to the Developer Tool
  • Brief Overview of Big Data Management Architecture
  • Reviewing the Developer interface
  • Lab: Configuring the Informatica Developer Tool

Module 2: Physical Data Objects

  • Introduction to the types of physical data objects
  • Using relational connections
  • Using flat file connections
  • Synchronize a flat file data object
  • Lab: Working with Connections
  • Lab: Creating Physical Data Objects

Module 3: Viewing Data

  • Introduction to data views
  • Troubleshooting configurations
  • Reviewing logs
  • Monitoring activities
  • Lab: Viewing Data on a Physical Data Object

Module 4: Mappings and Transformations

  • Mapping and transformation concepts
  • Core transformations
  • Developing and validating a mapping
  • Using transformations in a mapping
  • Lab: Developing a Mapping
  • Lab: Working with a Router Transformation

Module 5: Dynamic Mappings

  • Introduction to dynamic mapping concepts
  • Developing and running a dynamic mapping
  • Reviewing a mapping template
  • Lab: Developing a Dynamic Mapping

Module 6: Deploying Applications 

  • Deploying Applications for Users
  • Create, deploy, and run an application
  • Lab: Deploying Applications

Module 7: Parameters

  • Using Parameters
  • Types of Parameters
  • Parameter file and Parameter set
  • Run mappings with parameter file and parameter set
  • Lab: Creating a Mapping with Parameter Set
  • Lab: Run a Mapping with Parameter Set
  • Lab: Using Parameters in a Dynamic Mapping

Module 8: Workflows

  • Working with Workflows
  • Create a workflow
  • Configure a workflow
  • Add a conditional sequence flow
  • Lab: Creating a Workflow

 

>>Big Data for Developers

now.informatica.com/Big-Data-for-Developers-onDemand.html

Course Overview

This course is applicable to software version 10.1. Discover how to leverage Informatica Big Data Management for the optimization of data warehousing by offloading data processing to Hadoop.

Objectives

After successfully completing this course, students should be able to:

  • Define “Big Data” as it applies to Informatica and ETL/ELT
  • Enumerate the primary components of a Hadoop environment
  • Describe a process to identify and prioritize migration of expensive Data Warehouse processes to Hadoop
  • Migrate PowerCenter mappings to Big Data Management and ingest data into Hadoop
  • Use Sqoop and the SQL to Mapping capability to migrate and ingest data into Hadoop
  • Describe the Informatica on Hadoop architecture
  • Transform data on Hadoop using Informatica polyglot computing
  • Enumerate the capabilities of the Informatica engines on Hadoop including Hive MR, Hive Tez, Blaze, and Spark engines
  • Identify the optimization methods used by the Informatica Smart Executor
  • Utilize Informatica and Hadoop monitoring and troubleshooting
  • Parse and transform complex data using the DT transformation and Big Data Parser

Target Audience

  • Developer

Prerequisites

Agenda

Module 1: Big Data Integration Course Introduction

  • Course Agenda
  • Accessing the lab environment
  • Related Courses

Module 2: Big Data Basics

  • Hadoop concepts
  • Hadoop Architecture Components
  • The Hadoop Distributed File System (HDFS)
  • MapReduce 
  • Yet Another Resource Negotiator (YARN) (MapReduce version 2)

Module 3: Informatica Big Data Management Architecture

  • Explanation of the Big Data world
  • Explanation of the “Build Once, Deploy Anywhere” concept
  • Illustrate the Informatica abstraction layer
  • Informatica’s Polyglot computing engines
  • Smart Executor
  • Open Source through Innovation
  • Connection architecture 
  • Informatica connections to third party applications on Hadoop

Module 4: Data Warehouse Offloading

  • Challenges with traditional Data Warehousing
  • The requirements of an optimal Data Warehouse 
  • Data Warehouse Offloading Process

Module 5: Code Migration and Ingestion

  • Create and interpret PowerCenter Reuse Reports
  • Import PowerCenter Mappings to Developer
  • Sqoop
  • SQL to Mapping capability for converting SQL code to Informatica mappings

Module 6: Informatica Polyglot Computing in Hadoop

  • Hive MR/Tez
  • Blaze 
  • Spark 
  • The Smart Executor

Module 7: Monitoring, Logs, and Troubleshooting

  • Monitor mappings
  • Troubleshooting mappings in Hive

Module 8: Hadoop Data Integration Challenges and Performance Tuning

  • Challenges with executing mappings in Hadoop
  • Partitioning and Parallel Workflows
  • Big Data Management Performance Tuning
  • Mapping Level Tuning
  • Tips

Module 9: Complex File Parsing

  • Complex file reader
  • Data Processor transformation
  • Complex file writer
  • Performance Considerations: Partitioning
  • Data Processor Transformation Considerations

Module 10: NoSQL Databases

  • NoSQL Databases – an overview
  • Informatica HBase support
  • Informatica MongoDB support
  • Informatica Cassandra support

>>Big Data Streaming for Developers

now.informatica.com/Big-Data-Streaming-for-Developers.html

Course Overview

This course is applicable to software versions 10.2.1 and forward. Gain the skills necessary to execute end-to-end big data streaming use cases. Learn to prepare, process, enrich, and maintain streams of data in real time using Informatica Edge Data Streaming, Kafka, and Spark.

Objectives

After successfully completing this course, students should be able to:

  • Discuss streaming
  • Describe Kappa architecture
  • List the types of streaming data
  • Create an EDS Service
  • Create, deploy, and monitor a data flow
  • List the BDS key features
  • Describe the BDS component architecture
  • Describe Kafka data objects
  • Create Kafka connections
  • Discuss and list sources and targets in a streaming mapping
  • Discuss lookup sources
  • Execute a streaming mapping
  • Monitor logs and troubleshoot streaming mappings

Target Audience

  • Developer

Prerequisites

  • None

>> Data Engineering Integration: Administration

now.informatica.com/Data-Engineering-Integration-Administration-ILT.html

Applicable for users of software version 10.4. Set up a live DEI environment by performing administrative tasks such as Hadoop integration, Databricks integration, security configuration, monitoring, and performance tuning. Learn to integrate the Informatica domain with the Hadoop and Databricks ecosystems, leveraging Hadoop’s fast distributed processing and Databricks’ cloud analytics platform to process huge data sets.

Objectives

After successfully completing this course, students should be able to:

  • Prepare and list the steps for installation and configuration of DEI 10.4
  • List the steps to enable Kerberos on the Domain
  • List the steps to upgrade DEI from the previous versions to 10.4
  • Create Cluster Configuration Object for Hadoop integration
  • Set up Informatica Security that includes different Authentication and Authorization mechanisms
  • Tune the performance of the system
  • Monitor, view, and troubleshoot DEI logs
  • Monitor using REST APIs and log aggregator

Target Audience

  • Administrator 

Prerequisites

  • None

Agenda

Module 1: Introduction to Data Engineering Integration Administration

  • Data Engineering and the role of DEI in the big data ecosystem
  • DEI Components
  • DEI architecture
  • Roles and responsibilities of Informatica DEI Administrator
  • DEI engines: Blaze, Spark, and Databricks
  • DEI 10.4 features

Module 2: Data Engineering Integration 10.4 Installation and Configuration

  • Basic setup for installation
  • Plan the Installation Components
  • Steps to install the DEI product
  • Steps to create and configure Application Services
  • Steps to install the Developer client
  • Steps to uninstall Informatica Server

Module 3: Enable Kerberos Authentication on the Domain

  • Kerberos concepts
  • Kerberos protocol authentication steps
  • Single- and cross-realm Kerberos authentication
  • Prepare to enable Kerberos Authentication on the Domain

Module 4: Upgrade Data Engineering Integration to 10.4

  • Informatica upgrade overview
  • Informatica upgrade support
  • Steps involved in the upgrade process
  • Steps to upgrade DEI 10.2.2 Server to 10.4
  • Steps to upgrade DEI Developer client from 10.2.2 to 10.4

Module 5: Hadoop Integration

  • Cluster Integration overview
  • Data Engineering Integration Component Architecture
  • Prerequisites for Hadoop integration
  • Metadata Access Service (MAS)
  • HDP integration tasks
  • Create a Cluster Configuration
  • Integration with Hadoop
  • Lab: Create Metadata Access Service (MAS)
  • Lab: Create Cluster Configuration Object
  • Lab: Cluster Configuration Privileges and Permissions

Module 6: Data Engineering Integration Security – Authentication

  • DEI security
  • Security aspects
  • Authentication overview
  • Operating System profiles
  • Kerberos authentication
  • Apache Knox Gateway authentication
  • Lab: Run mappings on a cluster with Kerberos authentication
  • Lab: Execute mappings using Operating System profiles
  • Lab: Run mappings on a cluster with Apache Knox Gateway authentication

Module 7: Data Engineering Integration Security – Authorization

  • Authorization overview
  • HDFS permissions
  • Configure access to an SSL-Enabled Cluster
  • Security using Apache Ranger authorization
  • Fine Grained authorization
  • Lab: Set and check permissions on files and directories stored in HDFS
  • Lab: Configure Access to an SSL-Enabled Cluster
  • Lab: Execute a mapping using user impersonation with Kerberos authentication using Ranger authorization
  • Lab: Impose row and column level restrictions using Fine Grained authorization
  • Lab: Impose masking rules on Hive source columns using Dynamic Data Masking

Module 8: Data Engineering Recovery

  • Data Integration Service (DIS) processing overview
  • DIS Queuing
  • Execution Pools
  • Data Engineering recovery
  • Monitor recovered jobs
  • Tune for Data Engineering Job Processing
  • Lab: Configure Data Engineering recovery
  • Lab: Monitor recovered jobs

Module 9: DEI Performance Tuning

  • DEI Deployment types
  • Sizing recommendations
  • Hadoop cluster Hardware tuning
  • Tune Blaze performance
  • Tune Spark performance
  • Tune Databricks performance
  • Tune Data Integration Service
  • Tune Model Repository Service
  • Tune Sqoop performance
  • infacmd autotune command
  • Lab: Enhance performance of Spark engine
  • Lab: Tune Sqoop parameters
  • Lab: Tuning for Data Engineering Job Processing
  • Lab: Execute infacmd autotune command

Module 10: Monitoring Mappings

  • View Data Integration Service generated logs
  • View logs for the Blaze or Spark engine
  • Monitor Spark engine
  • View Spark logs
  • Log Aggregation
  • REST Operations Hub overview
  • Monitoring Metadata document
  • REST APIs
  • Display Nodes Used in Mapping
  • Lab: Use the Administrator tool to view Spark logs
  • Lab: Use Log aggregation to aggregate Application Services logs
  • Lab: Monitor Mappings using REST APIs

Module 11: Troubleshooting

  • Troubleshooting tips for common admin problems
  • Lab: Troubleshoot practical admin issues

Module 12: Databricks Integration

  • Databricks Integration overview
  • Informatica and Databricks environment components
  • Run-time Process on the Databricks Spark Engine
  • Databricks Integration Task Flow
  • Pre-requisites for Databricks integration
  • Steps to integrate Databricks and run a mapping within the Databricks environment
  • Lab: Create a Databricks Cluster Configuration
  • Lab: Configure the Databricks Connection
  • Lab: Run a mapping in the Databricks environment

Module 13: Configuring Data Engineering Integration on Kubernetes

  • How Data Engineering Integration Works with Kubernetes
  • Kubernetes Architecture
  • Prerequisites to install Kubernetes
  • Create a Kubernetes cluster
  • Configure Data Engineering Integration on Kubernetes

==

>Data Governance

Community User

Axon for Community Users (Instructor Led or onDemand)

Contributor User

Axon for Community Users (Instructor Led or onDemand)
Axon Content Curation (Instructor Led)

Super User

Axon for Community Users (Instructor Led or onDemand)
Axon Content Curation (Instructor Led)
Axon for Super Users (Instructor Led)

Admin

Axon for Community Users (Instructor Led or onDemand)
Axon Content Curation (Instructor Led)
Axon Install and Config (Instructor Led)
Axon Data Governance (New Version Coming Soon)

>>Axon for Community Users

Course Overview

Applicable for users of software version 6.x. Learn to access Axon inventories to explore key organizational business structures. Utilize Axon’s Quick and Unison Search capabilities to locate and review business glossaries, systems, datasets, and attribute information, and leverage insight maps to visualize and expose impacts, relationships, dependencies, and duplication.

Objectives

After successfully completing this course, students should be able to:

  • Discuss data governance initiatives and evolution.
  • Navigate the Axon user interface.
  • Identify and describe key inventories.
  • Utilize the Quick and Unison Search capabilities.
  • Perform complex searches across multiple inventories.
  • Customize Axon’s landing page.
  • Locate and review glossaries.
  • Raise and follow through on change requests.
  • Review system and data lineage.
  • Find and use Axon Insight and Local maps.

Target Audience

  • Business Analyst
  • Data Analyst
  • Data Steward
  • End User

Prerequisites

  • None

Agenda

Module 1: Axon’s Data Governance Methodology

  • The Evolution of Data Governance

Module 2: Introduction to Axon

  • Navigating Axon
  • Inventories and Grids
  • Unison Search
  • Maps and Dashboards
  • Quick Search
  • Objects and Properties
  • Lab: The Axon Environment
  • Lab: Quick Search
  • Lab: Unison Search

Module 3: Axon Users

  • User Permissions and effects
  • My Items
    • Recent/Following/Roles
    • User Information
  • Lab: Your Axon Environment
  • Lab: My Account
  • Lab: My Dashboard

Module 4: The Glossary

  • Glossary Overview
  • Lab: Glossary Properties and Hierarchies
  • Lab: Identifying Ungoverned Glossary Items

Module 5: Change Requests

  • Change Requests
  • Lab: Raising a Change Request

Module 6: Systems, Datasets, Interfaces & Lineage

  • Lineage Overview
  • System and Data Lineage
  • Lab: Reviewing Systems, Datasets and Attributes

Module 7: Insight and Local Maps

  • Axon Maps
  • Insight and Local Maps
  • Map Palette
  • Map Overlays
  • Lab: Navigate Insight Maps
  • Lab: Review Local Maps

>>Axon Content Curation

Course Overview

Applicable for users of software version 6.x. Build your Data Governance strategy step by step, delivering content that the whole organization can consume and leverage. Learn to curate, create, and upload content while using Axon inventories to define, document, and explore key organizational business structures.

Objectives

After successfully completing this course, students should be able to:

  • Participate in data governance initiatives.
  • Identify Axon Data Governance use cases.
  • Discuss the use of permissions in Axon.
  • Manually create objects in Axon.
  • Use templates to upload objects.
  • Edit objects, including defining descriptions, stakeholders, and impact.
  • Configure system and data lineage.
  • Link Axon Systems and Attributes to EDC Resources and Fields.
  • Curate automatically on-boarded EDC Attributes.
  • Define projects, policies and processes.
  • Design and create custom workflows.
  • Raise and follow through on change requests.
  • Explain Axon’s Mandatory Workflow Process.
  • Describe Secure@Source Privacy Dashboards.
  • Use Axon to display Data Quality Scores across key enterprise elements.
  • Automate Data Quality Rules.
  • Use Axon from a project perspective.

Target Audience

  • Business Analyst
  • Data Scientist
  • Data Steward
  • End User

Prerequisites

Agenda

Module 1: Axon’s Data Governance Methodology

  • Data Governance Evolution
  • Must-Dos for Data Governance Success
  • Principles for an Axon Implementation
  • Key Axon Resources
  • Getting Started – object order definition
  • Lab: Getting Started

Module 2: Getting Started

  • User Permissions and effects
  • Manual creation of objects
  • Bulk upload of objects
  • Project Overview
  • Lab: Project Kickoff

Module 3: The Glossary

  • Glossary definition best practices
  • Manual creation of glossary items
  • Bulk upload creation of glossary items
  • Lab: Manually creating Glossary objects in Axon
  • Lab: Create Glossary objects in Axon using Bulk Upload
  • Lab: Updating ungoverned Glossary items
  • Lab: Define a Glossary Relationship
  • Lab: Creating Links from the Project to the Glossary Objects

Module 4: Systems, Datasets, Interfaces & Lineage

  • System and Data Lineage
  • Manual definition of systems and system interfaces
  • Manual linking of dataset attributes
  • Bulk update of existing attributes
  • Bulk upload for attribute lineage
  • Lab: Define a System
  • Lab: Create a System Interface
  • Lab: Manually create a Dataset with Attributes
  • Lab: Define Attribute Links and review Lineage
  • Lab: Bulk upload Attribute Links

Module 5: Leveraging Enterprise Data Catalog Content in Axon

  • EDC Overview
  • Linking to EDC Resources
  • Linking Attributes to EDC Fields
  • Lineage Recommendations
  • Glossary and Data Domain Associations
  • Curate auto-onboarded objects
  • Lab: Link to EDC Resources
  • Lab: Link Attributes to EDC Fields
  • Lab: Review Fields and Lineage in EDC
  • Lab: Review Lineage Recommendations
  • Lab: Curate Auto-onboarded Attributes

Module 6: Policies

  • Policy Overview
  • Manual creation of policies
  • Creating child/sub policies
  • Lab: Create a Policy
  • Lab: Bulk update existing Policies

Module 7: Leveraging Secure@Source Dashboards in Axon

  • Integration Overview
  • System Privacy Dashboard
  • Policy Privacy Dashboard

Module 8: Process

  • Process Overview
  • Manual creation of processes
  • Bulk upload of processes and process predecessors
  • Process components functionality
  • Process maps
  • Lab: Create a Process
  • Lab: Bulk upload Processes
  • Lab: Define and upload Process Predecessors
  • Lab: Link Processes

Module 9: Workflows and Change Requests

  • Custom and default workflows
  • Workflow creation
  • Change requests
  • Active workflows and workflow notifications
  • Lab: Create a Workflow
  • Lab: Raise a Change Request

Module 10: Mandatory Approval Process

  • Mandatory Approval Process Overview
  • Creating and editing objects when mandatory approval has been implemented
  • Lab: Configure Mandatory Workflow Approval
  • Lab: Step through Mandatory Workflow Approval for System Creation
  • Lab: Step through Mandatory Workflow Approval for System Editing

Module 11: Data Quality

  • Data Quality Overview
  • Local and Standard Data Quality rules
  • Manual and bulk upload of local DQ rules and reports
  • Using Data Quality to update scores in Axon
  • Automating Data Quality Rules
  • Lab: Data Quality Measures
  • Lab: Data Quality Reports
  • Lab: Axon Integration with Data Quality
  • Lab: Process Related Data Quality
  • Lab: Automating Data Quality Rules

Module 12: Projects – Impact Assessment

  • Project connections
  • Relating projects and project dependencies
  • Assess project impact
  • Review DQ Dashboard
  • Lab: Review and update the GDPR Project

>>Axon for Super Users

Course Overview

This course is applicable for version 6.2. Learn how to perform more advanced business administration tasks in Axon. Use the Admin Panel to customize the Axon instance with your organization’s preferences.

Objectives

After successfully completing this course, students should be able to:

  • Perform advanced tasks such as deleting and bulk updating objects
  • Describe Axon’s Architecture
  • Create Roles and assign Permissions
  • Configure Custom Fields and update Dropdowns
  • Build Default Workflows
  • Implement Mandatory Workflow Approval
  • Manage Locked Objects
  • Configure Display Settings

Target Audience

  • Business Analyst
  • Business User
  • Data Steward

Prerequisites

Agenda

Module 01: Axon Advanced Tasks

  • Perform Bulk Updates
  • Clone Datasets
  • Delete Objects
  • Lab: Perform Bulk Updates
  • Lab: Clone a dataset
  • Lab: Delete objects in Axon

Module 02: Axon Admin Panel – The Admin Dashboard

  • Axon Architecture and Services
  • The Admin Dashboard
  • Lab: Access the Admin Panel and review the status of the services in the Admin Dashboard

Module 03: Axon Admin Panel – The DG Operating Model

  • Role Permissions
  • Default Workflows
  • Default Change Requests
  • Roles and Responsibilities
  • Licensed Users
  • Lab: Create Roles and assign Permissions
  • Lab: Create Default Workflows
  • Lab: View Licensed Users
  • Lab: Use a default workflow to raise a CR on a Glossary
  • Lab: Configure Mandatory Workflow Approval

>>Axon Installation and Configuration

Course Overview

Applicable for users of version 6.2. Explore the skills required to install, administer, and configure Informatica Axon. Learn how to install or upgrade Axon, configure integration with other Informatica products, troubleshoot and maintain your Axon environment, and back up and restore the Axon database.

Objectives

After successfully completing this course, students should be able to:

  • Describe Axon’s Architecture
  • Summarize the different Services in Axon, describing what each one does
  • Provide an overview of Axon Integration with Data Quality, EDC and Secure@Source
  • Run the pre-validation script ahead of performing an Axon Installation
  • Install RPM files
  • Install Axon
  • Start the Axon Services and log into Axon
  • Upgrade Axon
  • Connect to Data Quality, EDC and Secure@Source
  • Install Axon Quick Look
  • Troubleshoot issues
  • Back up and restore Axon’s PostgreSQL database

Target Audience

  • Administrator

Prerequisites

  • None

Agenda

Module 1: Axon Architecture

  • Axon Components and Service

Module 2: Axon Integration Overview

  • Axon Integration with Data Quality
    • The Axon Agent
  • Axon Integration with EDC
  • Axon Integration with Secure@Source

Module 3: Axon Pre-Validation and Installation

  • System Prerequisites
  • Run the pre-validation script
  • Install RPMs
  • Install Axon
  • Start the Services
  • Log into Axon
  • Lab: Run the pre-installer utility and add any missing RPMs
  • Lab: Install Axon and verify the installation was successful

Module 4: Upgrading Axon

  • System Prerequisites
  • Upgrade Axon
  • Start Services
  • Log into Axon
  • Lab: Verify the system meets the requirements for an upgrade
  • Lab: Upgrade from 5.4 to 6.2 and verify the upgrade was successful

Module 5: Axon Admin Panel

  • Operational Management:
    • Static Page Editor
    • Resync LDAP
    • Download logs
  • Customize and Configure
    • Change the Logo
    • Customize Styles
    • System Settings:
      • SAML
      • Email
      • LDAP
  • Lab: Review the Admin Dashboard
  • Lab: Review the Static/Help pages
  • Lab: Change the Logo and review where to update Style Sheets
  • Lab: Configure Axon including defining the LDAP Server and synchronizing Org Units and Users
  • Lab: Review logs

Module 6: Axon Integration with Enterprise Data Catalog (EDC)

  • Integration with EDC
  • Configure Parameters in the Admin Panel
  • Connect to EDC
  • Onboarding EDC Attributes into Axon

Module 7: Axon Integration with Secure@Source

  • Integrate with Secure@Source
  • Configure Parameters in the Admin Panel
  • Leveraging SATS Dashboards in Axon
  • Lab: Update the Admin Panel to include both EDC and SATS configuration information
  • Lab: Connect to EDC and review the Resources and Columns in Axon
  • Lab: Verify the appropriate steps have been taken in EDC to facilitate automatic onboarding into Axon
  • Lab: Configure and test auto onboarding
  • Lab: Review a SATS Privacy Dashboard in an Axon System
  • Lab: Connect an Axon policy to a policy in SATS and review the Privacy Dashboard

Module 8: Axon Integration with Data Quality

  • Integration with Data Quality using the Axon Agent
  • Install the Axon Agent
  • Configure Parameters in the Admin Panel
  • Lab: Install and start the Axon Agent
  • Lab: Post Installation Tasks
  • Lab: Verify the agent has installed correctly through the configuration and automatic update of a local rule
  • Lab: Configure the parameters for local rule generation from standard rules
  • Lab: Create standard rules and verify that local rules are generated

Module 9: Axon Quick Look

  • Axon Quick Look Overview
  • Axon Quick Look Installation
  • Lab: Installing Axon Quick Look

Module 10: Maintain Axon

  • Troubleshoot issues
  • Identify related Services
  • Review relevant Logs
  • Verify Service Statuses
  • Restart Services
  • Resolve issues
  • Lab: Troubleshooting Axon

Module 11: Backup and Restore Axon’s PostgreSQL Database

  • Back up Axon’s PostgreSQL database
  • Restore a PostgreSQL backup
  • Lab: Back up a PostgreSQL database
  • Lab: Restore a PostgreSQL database


>Data Quality

Analyst/Data Steward

Informatica Analyst, Data Discovery and Advanced Profiling (Instructor Led)
OR
Informatica Analyst (onDemand)
Data Discovery and Advanced Profiling (onDemand)

  

Developer

Data Quality Mgmt. for the Developer (Instructor Led or onDemand)
Data Quality: Adv. Techniques (Instructor Led or onDemand)

Administrator

Data Quality Mgmt. for the Developer (Instructor Led or onDemand)
Data Quality Administration (Instructor Led or onDemand)

>> Informatica Analyst, Data Discovery and Advanced Profiling

Course Overview

This course is applicable to software version 10.x. Learn to use the Informatica Analyst and Developer tools to create projects and objects, profile data, and identify anomalies in order to develop a better understanding of data sets. Discover the content and structure of data through topics such as Data Domain Discovery and Enterprise Discovery Profiling, including Join Analysis, Functional Dependency Profiling, Primary Key Inference, Overlap Discovery, and Foreign Key Profiling.

Objectives

After successfully completing this course, students should be able to:

  • Navigate through the Informatica Analyst tool
  • Create projects and assign permissions
  • Work with Physical and Logical data objects
  • Perform Column Profiling
  • Create and apply Rules to profiles
  • Create and manage Reference Tables
  • Build a Rule Specification
  • Manage Exception and Duplicate Record Tasks
  • Perform Scorecarding
  • Manage tasks
  • Apply Data Domain Discovery to profiles in the Analyst Tool
  • Build and execute an Enterprise Discovery Profile in the Analyst Tool
  • Define the different types of Profiling that are available in the Developer Tool
  • Create and execute Enterprise Discovery Profiles in the Developer Tool
  • Perform Data Domain Discovery and Join Analysis Profiling
  • Perform Functional Dependency and Primary Key Inference
  • Perform Overlap Discovery and Foreign Key Profiling
  • Update the model with Primary and Foreign Key relationships
  • Generate DDL from the updated model

Target Audience

  • Business Analyst
  • Data Analyst
  • Data Steward
  • Developer

Prerequisites

  • None

Instructor Led | Data Quality | 3 Days | Version 10

Agenda

Informatica Analyst

Module 01: Introduction to Data Quality

  • Define Data Quality Management.
  • Explain the Data Quality Process.
  • Discuss Informatica’s Data Quality Architecture.
  • Navigate the Analyst Interface.
  • Create Projects and Assign Permissions.
  • Lab: Create a Project and Assign Permissions.

Module 02: Data Objects

  • Understand the types of Data Objects that exist in Informatica 10.
  • Import Flat Files and create connections to Relational Tables.
  • Build a Mapping Specification.
  • Lab: Create a variety of Data Objects.
  • Lab: Create a Mapping Specification.

Module 03: Analyst Profiling

  • Introduction to Profiling.
  • Perform Column Profiling in Informatica Analyst.
    • Interpret Profile Results using Summary and Detail view.
    • Edit Profiles.
    • Detect Outliers
    • Review and Compare Historical Profile Results
    • Schedule Profiles.
  • Lab: Create profiles, detect outliers and review profile results

Module 04: Project Collaboration

  • Collaborate with team members using shared Projects, Comments, Bookmarks and Data Exports.
  • Lab: Add Comments to Profiles and Export Profile Results.

Module 05: Rules

  • Perform Rule Profiling in Informatica Analyst.
    • Apply Prebuilt and Custom Rules.
    • Interpret Rule Results.
  • Lab: Apply Prebuilt and Custom Rules to Profiles.
  • Lab: Apply Rules to a Mapping Specification.

Module 06: Reference Table Management

  • Create Reference Tables using the Reference Table Editor and the Import Flat Files options.
  • Create Managed and Unmanaged Reference Tables.
  • Audit changes.
  • Lab: Create Reference Tables using a variety of methods.

Module 07: Rule Specifications

  • Convert business rules into Rule Specifications that can be applied to data.
  • Validate and Test the Rule Specification.
  • Compile the Rule Specification to create a Mapplet.
  • Review the Mapplet in the Developer Tool.
  • Lab: Create, Test and Compile a Rule Specification

Module 08: Task Management

  • Manage Exception Records.
  • Manually correct Bad/Exception Records and review the Audit Trail.
  • Consolidate Duplicate Records.
  • Review the Audit Trail.
  • Administer Tasks.
  • Lab: Manage Exception Records in the Analyst Tool.
  • Lab: Manage Duplicate Records in the Analyst Tool.
  • Lab: Administer Tasks.

Module 09: Scorecarding

  • Understand how to build and configure a Scorecard.
  • Apply Weightings.
  • Review Trend Charts.
  • Configure Notifications.
  • Lab: Create, Edit and Run a Scorecard.
  • Lab: Configure and Review Notifications.
  • Lab: Schedule Scorecards and Profiles to run using Informatica Administrator.

Data Discovery and Advanced Profiling

Module 1: Analyst Tool: Data Domain Discovery and Enterprise Discovery Profiling

  • Data Domain Profiling
  • Inferring/identifying sensitive data
  • Building and executing an Enterprise Discovery Profile
  • Lab: Update a Profile to perform Data Domain Discovery
  • Lab: Create, execute and review an Enterprise Discovery Profile

Module 2: Developer Tool: Advanced Profiling

  • Varieties of Profiling
  • Column and Enterprise Discovery Profiles
  • Navigating an Enterprise Discovery Profile
  • Inference Options
  • Join Analysis Profiling
  • Lab: Create and navigate an Enterprise Discovery Profile
  • Lab: Perform Join Analysis Profiling

Module 3: Developer Tool: Functional Dependency and Primary Key Inference

  • Object content and structure
  • Primary Keys
  • Functional Dependencies
  • Sensitive Data across all objects in the model
  • Lab: Import and Profile a new set of data
  • Lab: Perform Functional Dependency and Primary Key Inference

Module 4: Developer Tool: Overlap Discovery and Foreign Key Profiling

  • Overlap Discovery between objects
  • Primary Key – Foreign Key relationships across objects
  • Verifying and Approving relationships
  • Generating DDL to create a new schema
  • Lab: Overlap Discovery
  • Lab: Primary Key – Foreign Key relationships

>> Informatica Analyst

Course Overview

The Informatica Analyst tool allows you to analyze data by creating projects and objects, profiling data, and identifying anomalies. Learn how to use the Analyst tool to understand relationships between tables and files, build Reference Tables and Scorecards, and manage Exception and Duplicate Record tasks. This course is applicable to software version 10.

Objectives

After successfully completing this course, students should be able to:

  • Navigate through the Informatica Analyst tool
  • Create projects and set permissions
  • Work with physical and logical data objects
  • Perform Column and Rule Profiling
  • Apply rules in Mapping Specifications
  • Create and manage reference tables
  • Create rule specifications
  • Perform Scorecarding
  • Manage tasks

Target Audience

  • Business Analyst
  • Data Analyst

Agenda

Module 1: Data Quality Architecture and Data Objects

  • Projects and project permissions
  • Types of data objects in Informatica
  • Importing flat files and relational tables

Module 2: Profiling

  • Introduction to profiling
  • Column and Rule profiling in Informatica Analyst
  • Building a mapping specification
  • Rule profiling

Module 3: Reference Table Management

  • Reference tables and importing flat files
  • Managed and unmanaged reference tables

Module 4: Rule Specification

  • Converting business rules
  • Reference tables with rule specifications

Module 5: Scorecards

  • Scorecard and Pre-built rules
  • Specifying valid column values

Module 6: Informatica Exception Management

  • Managing exception records
  • Consolidating duplicate records

>> Data Discovery and Advanced Profiling

Course Overview

Applicable for version 10.x users. Discover the content and structure of data through topics such as Data Domain Discovery and Enterprise Discovery Profiling, including Join Analysis, Functional Dependency Profiling, Primary Key Inference, Overlap Discovery, and Foreign Key Profiling.

Objectives

After successfully completing this course, students should be able to:

  • Apply Data Domain Discovery to profiles in the Analyst Tool
  • Build and execute an Enterprise Discovery Profile in the Analyst Tool
  • Define the different types of Profiling that are available in the Developer Tool
  • Create and execute Enterprise Discovery Profiles in the Developer Tool
    • Perform Data Domain Discovery and Join Analysis Profiling
    • Perform Functional Dependency and Primary Key Inference
    • Perform Overlap Discovery and Foreign Key Profiling
    • Update the model with Primary and Foreign Key relationships
    • Generate DDL from the updated model.

Target Audience

  • Data Analyst
  • Data Steward
  • Developer

Prerequisites

Agenda

Module 1: Analyst Tool: Data Domain Discovery and Enterprise Discovery Profiling

  • Data Domain Profiling
  • Inferring/identifying sensitive data
  • Building and executing an Enterprise Discovery Profile
  • Lab: Update a Profile to perform Data Domain Discovery
  • Lab: Create, execute and review an Enterprise Discovery Profile

Module 2: Developer Tool: Advanced Profiling

  • Varieties of Profiling
  • Column and Enterprise Discovery Profiles
  • Navigating an Enterprise Discovery Profile
  • Inference Options
  • Join Analysis Profiling
  • Lab: Create and navigate an Enterprise Discovery Profile
  • Lab: Perform Join Analysis Profiling

Module 3: Developer Tool: Functional Dependency and Primary Key Inference

  • Object content and structure
  • Primary Keys
  • Functional Dependencies
  • Sensitive Data across all objects in the model
  • Lab: Import and Profile a new set of data
  • Lab: Perform Functional Dependency and Primary Key Inference

Module 4: Developer Tool: Overlap Discovery and Foreign Key Profiling

  • Overlap Discovery between objects
  • Primary Key – Foreign Key relationships across objects
  • Verifying and Approving relationships
  • Generating DDL to create a new schema
  • Lab: Overlap Discovery
  • Lab: Primary Key – Foreign Key relationships

>> Data Quality: Data Quality Management for Developers

Course Overview

Gain the skills and knowledge necessary to implement and automate a data quality assurance process with the Informatica Data Quality platform. In addition to learning how to cleanse, standardize, and enhance data, students will test and troubleshoot their Data Quality solutions. This course is applicable for version 10.1.1.

Objectives

After successfully completing this course, students will be able to:

  • Describe the overall Data Quality Management Process.
  • Illustrate the Data Quality Architecture.
  • Differentiate between the Analyst and Developer Roles and Tools.
  • Navigate the Developer Tool and collaborate on projects with team members.
  • Perform Column, Rule, Multi object, Comparative and Mid-Stream Profiling.
  • Manage Reference Tables.
  • Develop standardization, cleansing and parsing Mappings and Mapplets.
  • Identify duplicate records using Classic Data Matching.
  • Create and execute Workflows to populate user inboxes with Exception and Duplicate record tasks.
  • Describe the deployment options that are available when executing Mappings outside of Informatica Developer.
  • Troubleshoot issues that may appear during development.

Target Audience

  • Developers

Prerequisites

  • None

Agenda

Module 1: Course Introduction

  • Discuss course objectives.
  • Walk through the course agenda.

Module 2: Data Quality Process Overview

  • Describe the overall Data Quality Management Process Cycle.
  • Identify dimensions of Data Quality including Completeness, Conformity, Consistency, Accuracy, Duplication, and Integrity.
  • List and describe the Data Quality Processes including Profiling, Standardization, Matching and Consolidation.
  • Differentiate between the Developer and Analyst roles and tools.
  • Describe the Data Quality Architecture.

Module 3: Data Quality Projects and Solutions

  • Examples of customer Data Quality use cases.
  • The types of projects that benefit from cleansed and standardized data.
  • Describe where Data Quality fits in a typical DI/DQ project.
  • The differences between Reporting, Gating and Cleansing projects.
  • Example solutions architecture for typical projects involving Data Quality.

Module 4: Project Collaboration and Reference Table Management

  • Work in the Developer GUI.
  • Review projects created by the analyst including data objects, profiles, rules, scorecards, comments and tags.
  • Reference tables and where they fit into the Data Quality process.
  • Create reference tables by importing a flat file and connecting to an Oracle table.
  • Lab: Review a project created by an analyst user in the Analyst tool.
  • Lab: Build Reference Tables.

Module 5: Working in the Developer Tool

  • List and describe some of the tasks that will be performed in the Developer tool.
  • Work with physical and logical data objects, create a connection to a table, import a flat file and create a logical data object.
  • Explore Developer transformations.
  • Recognize the difference between mappings and mapplets.
  • Describe content sets and their uses.
  • Apply Developer tips and tricks.
  • Lab: Create a project and assign permissions.
  • Lab: Create a connection to an Oracle table and import a flat file.
  • Lab: Build a logical data object.

Module 6: Profiling, Mapplets and Rules

  • Apply knowledge gained to perform Column Profiling and interpret the results.
  • Create and validate a mapplet/rule that will be used in a scorecard.
  • Use profiling techniques to debug and speed up mapping/mapplet development.
  • Using Informatica Analyst, update the scorecard with the rule created.
  • Lab: Create a rule to measure the accuracy of data in a field.
  • Lab: Using Informatica Analyst, apply the rule to a scorecard and review the results.

Module 7: Standardizing, Cleansing and Enhancing Data

  • Define what it is to standardize, cleanse and enhance data.
  • Create a mapping to cleanse, standardize and enhance data.
  • Design and develop standardization mapplets.
  • Describe and configure a range of standardization transformations.
  • Lab: Build a Standardization mapping and mapplets using standardization transformations.

Module 8: Parsing Data

  • What it is to parse data.
  • The parsing process and what is involved.
  • The various different parsing techniques and when to use them.
  • Configure key parsing transformations.
  • Lab: Perform parsing using a variety of parsing transformations and strategies.
  • Lab: Complete the standardization mapping.

Module 9: Grouping and Matching Data

  • What it is to match data.
  • Describe the DQ matching process and explain the different stages of matching.
  • Why grouping is a necessary precursor to matching and the effect it has on matching.
  • Perform grouping using a variety of methods.
  • Review and explain the results of grouping, refining the grouping strategy if necessary.
  • Differentiate between match algorithms and use the most appropriate one for each data type.
  • Lab: Build and fine tune a grouping and matching mapping.
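
The reason grouping must precede matching can be sketched outside the product. The snippet below is plain Python, not Informatica transformations; the sample records, the composite group key, and the 0.85 match threshold are all illustrative assumptions. Grouping restricts the expensive pairwise fuzzy comparisons to records that share a key, which is what makes matching tractable on large datasets.

```python
from collections import defaultdict
from difflib import SequenceMatcher

# Illustrative records only; real data would come from a data object.
records = [
    {"id": 1, "name": "John Smith", "zip": "94105"},
    {"id": 2, "name": "Jon Smith",  "zip": "94105"},
    {"id": 3, "name": "Mary Jones", "zip": "10001"},
]

# Grouping step: a composite key (ZIP code + first letter of the name)
# so that only plausible duplicates are ever compared to each other.
groups = defaultdict(list)
for rec in records:
    groups[(rec["zip"], rec["name"][0])].append(rec)

# Matching step: fuzzy-compare names only within each group.
matches = []
for members in groups.values():
    for i, a in enumerate(members):
        for b in members[i + 1:]:
            score = SequenceMatcher(None, a["name"], b["name"]).ratio()
            if score >= 0.85:  # assumed threshold
                matches.append((a["id"], b["id"], round(score, 2)))

print(matches)  # records 1 and 2 match; record 3 is never compared
```

With a poor grouping strategy (too broad a key), every record is compared with every other; with too narrow a key, true duplicates land in different groups and are never compared, which is why the module stresses reviewing and refining the grouping strategy.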

Module 10: Manual Exception and Consolidation Management

  • Identify when exception and duplicate record management is necessary in a project.
  • Exception Management Process.
  • Build mappings to populate the appropriate tables with exception and duplicate record tasks.
  • Lab: Build a mapping that can be used to identify exception data.
  • Lab: Build a mapping that can be used to identify duplicate data.

Module 11: Building, Managing and Deploying Workflows

  • Define workflows and workflow tasks.
  • Identify Human tasks and steps.
  • Create workflows to identify exception and duplicate records.
  • Deploy and execute workflows.
  • Verify tasks have been created in Informatica Analyst.
  • Lab: Build a workflow to populate the Analyst Inbox with Exception Tasks.
  • Lab: Build a workflow to populate the Analyst Inbox with Duplicate Record Tasks.

Module 12: Deploying: Executing Mappings outside of the Developer tool

  • The various deployment options that are available.
  • Create and deploy mappings as applications.
  • Schedule mappings, profiles and a scorecard to run using Informatica Scheduler.
  • Lab: Schedule Mappings to run using Informatica Scheduler.

Module 13: Importing and Exporting Project Objects

  • Identify when to export/import projects.
  • Use Basic and Advanced Import options.
  • Export a project.
  • Lab: Import a project using the Basic method.
  • Lab: Import a project using the Advanced Method.
  • Lab: Export a project.

Module 14: Troubleshooting

  • Provide examples of errors encountered in the Developer and troubleshoot these errors.
  • Identify common mapping and transformation configuration issues.
  • Common workflow configuration errors.
  • Tips for working with the Developer tool.
  • Lab: Optional. Troubleshoot mapping configuration issues.

>> Data Quality: Advanced Techniques

Course Overview

This course is applicable for all version 10 releases. Learn to leverage advanced techniques when using the Developer tool to profile, cleanse, standardize, de-duplicate, and consolidate data in an enterprise. The course focuses on creating and applying custom-built Classifier and Probabilistic Models, using advanced Parsing and Matching methods, refining Human Tasks and Workflows, automatically Associating and Consolidating matched records, applying Parameters in mappings, and more.

Objectives

After successfully completing this course, students should be able to:

  • Perform Join Profiling.
  • Create and apply Classification Models.
  • Parse data using advanced techniques.
  • Create and apply Probabilistic Models.
  • Apply sophisticated Grouping and Matching techniques.
  • Automatically Associate and Consolidate matched records.
  • Refine Exception and Duplicate Record Workflows used to populate Analyst inboxes.
  • Design, Implement and Test processes to manage updated exception/duplicate records.
  • Apply appropriate DQ Parameters.
  • Examine Performance considerations.
  • Review CRM and Dashboard & Reporting Templates.
  • Optionally/Time allowing:
    • Leverage Web Services to apply DQ mappings in Excel.
    • Perform Identity Matching.
      • Use the Universal ID store to match against master data.

Target Audience

  • Developer

Prerequisites

Agenda

Module 1: Course Introduction

  • Course Introduction, Agenda and Overview

Module 2: Developer Review & Join Profiling

  • A quick review of Informatica Developer
  • Use Enterprise Discovery to create Join Profiles.
  • Lab: Perform Join Profiling using an Enterprise Discovery Profile

Module 3: Standardizing and Classifying Data

  • Review Standardization Techniques
  • Build, refine and apply a Classifier Model
  • Labs: Create, refine and apply Classifier Model

Module 4: Advanced Parsing Techniques

  • What is Probabilistic Labeling and Parsing?
  • Build, refine and apply a Probabilistic Model.
  • Additional Parsing Techniques:
    • Build regular expressions.
  • Labs: Build, refine and apply a Probabilistic Model
  • Lab: Review an example of Advanced Parsing
  • Lab: Generate and test Regular Expressions
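
The regular-expression work in the labs above can be illustrated outside the product. The sketch below is plain Python, not Informatica transformation syntax; the phone-number pattern and the field names are assumptions used only to show the parsing idea: split a raw value into labeled components, or flag it as unparsed for downstream exception handling.

```python
import re

# Hypothetical pattern for a US-style phone number with named capture groups.
PHONE = re.compile(r"\((?P<area>\d{3})\)\s*(?P<exchange>\d{3})-(?P<line>\d{4})")

def parse_phone(raw: str) -> dict:
    """Split a raw phone string into labeled components, or flag it as unparsed."""
    m = PHONE.search(raw)
    if m is None:
        return {"status": "unparsed", "raw": raw}
    return {"status": "parsed", **m.groupdict()}

print(parse_phone("(415) 555-1234"))
print(parse_phone("not a phone"))
```

The same pattern of "match, label, and route non-matches to an overflow" is what the parsing transformations formalize, whether the rules are token-based, pattern-based, or probabilistic.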

Module 5: Grouping & Matching Data

  • Additional Grouping Techniques
    • Using Composite keys
  • Advanced Matching Techniques
    • Matched pairs outputs.
    • Working with Match Mapplets.
    • Manipulating the matched data using the Driver ID
    • Perform Dual Matching
  • Lab: Create a Match mapping using Matched Pairs
  • Lab: Create and update a Match Mapplet
  • Lab: Manipulating Matched Data using the Driver ID
  • Lab: Perform Dual Matching using a Master Dataset.

Module 6: Automatically Associate and Consolidate Matched Data

  • Overview of the Consolidation Process
  • Use the Consolidation Transformation to consolidate matched data.
  • Use the Association Transformation to link matched data ahead of Consolidation.
  • Lab: Automatically Consolidate matched data.
  • Lab: Perform multi-criteria Matching, Association and Consolidation.
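
The idea of automatic consolidation can be sketched in plain Python. This is not the Consolidation transformation itself; the sample cluster and the survivorship rule here ("longest non-empty value wins" per field) are assumptions chosen only to illustrate how one survivor record is built from a cluster of matched records.

```python
# Illustrative matched cluster; real clusters come from the matching step.
cluster = [
    {"name": "J. Smith",   "email": "",                   "phone": "555-1234"},
    {"name": "John Smith", "email": "jsmith@example.com", "phone": ""},
]

def consolidate(records):
    """Merge matched records into a single survivor, field by field."""
    survivor = {}
    for field in records[0]:
        # Assumed survivorship rule: longest non-empty value wins.
        candidates = [r[field] for r in records if r[field]]
        survivor[field] = max(candidates, key=len) if candidates else ""
    return survivor

print(consolidate(cluster))
```

In practice the survivorship rule varies per field (most frequent, most recent, highest trust source), which is why the transformation exposes configurable strategies rather than a single merge behavior.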

Module 7: Task and Workflow Management

  • Additional Task and Workflow functionality:
    • Permission settings for data access and editing
    • Notifications including Human Task Notification Variables
    • Setting Timeouts
    • Reviewing Tasks
    • Configuring Workflow Recovery
  • Lab: Update the Exception Workflow
  • Lab: Review the Consolidation Workflow

Module 8: Processing Updated Exception and Cluster Data

  • How to process updated exception records
  • How to process consolidated records
  • Fields of Interest
  • Lab: Create a mapping to process updated exception data
  • Lab: Create a mapping to process consolidated data
  • Lab: Update and deploy Exception and Cluster Workflows

Module 9: Analyst Tasks

  • Update exception and duplicate records in Informatica Analyst
  • Lab: Update records and push the Tasks through the Exception Process
  • Lab: Update records and push the Tasks through the Consolidation Process

Module 10: Parameterization

  • Explain the difference between System and User defined parameters
  • Use Parameters in Data Quality mappings.
  • Lab: Create a parameterized mapping
  • Lab: Build and deploy an Application
  • Lab: Create and execute parameter files
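
The mechanics of a parameter file can be sketched in plain Python. This is not Informatica's parameter-file format; the name=value layout, the parameter names, and the `$`-prefixed references below are assumptions used only to show how user-defined parameters are resolved into a mapping's configuration at run time.

```python
from string import Template

# Hypothetical parameter file contents (name=value lines).
params_text = "SourceDir=/data/in\nMatchThreshold=0.9"

def load_params(text):
    """Parse name=value lines into a parameter dictionary."""
    return dict(line.split("=", 1) for line in text.strip().splitlines())

# A mapping configuration that references parameters instead of literals.
mapping_config = {"input": "$SourceDir/customers.csv", "threshold": "$MatchThreshold"}

# Resolve the $-prefixed references, mirroring how a deployed application
# picks up a parameter file at execution time.
params = load_params(params_text)
resolved = {k: Template(v).substitute(params) for k, v in mapping_config.items()}
print(resolved)
```

The point of the indirection is the same as in the lab: the deployed application stays unchanged while the parameter file swaps per environment (dev, test, production).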

Module 11: Performance tips and tricks

  • General Installation and Memory Information
  • DQ Component Configuration
    • Service Settings
  • DQ Transformations
    • Configuration Settings

Module 12: Optional – Data Quality at work

  • Learn how Data Quality has been implemented in different projects.

Module 13: CRM and Dashboard and Reporting Templates

  • Review the CRM and Dashboard and Reporting Templates that are available
  • Lab: Review the CRM Template

Appendix:

  • Module: DQ for Excel using Web Services
  • Use Data Quality Web Services to execute DQ mappings on Excel spreadsheets.
  • Lab: Use Web Services to execute mappings in Excel

Module: Identity Matching

  • Match Data using Identity Matching
    • Use UID to match data against a Master Data Store
  • Lab: Use Identity to match customer data
  • Mixed Matching Workshop
  • Lab: Universal ID, Create and load the Persistent Data Store
  • Lab: Match and update new records to the Store.

>> Data Quality: Administration

Course Overview

This course is applicable to software version 10. It provides students with the fundamental knowledge and skills to maintain an Informatica Data Quality environment, focusing on how to use the Informatica Administrator tool to maintain the required environment.

Objectives

After successfully completing this course, students should be able to:

  • Describe core administration tasks and tools
  • Configure the Informatica Administrator tool
  • Create and configure necessary services
  • Manage Informatica security
  • Audit security access and privileges
  • Perform ongoing maintenance
  • Stop or recycle a service
  • Review domain logs

Target Audience

  • Administrator

Agenda

Module 01: Data Quality 10 Architecture

  • Informatica Data Quality technical architecture
  • Informatica domain, nodes, and application services
  • Overview of the PowerCenter clients.

Module 02: Best Practices

  • Configuring an environment
  • Recognized Naming Conventions

Module 03: Installing Informatica 10

  • Installing Informatica 10
  • Reviewing installation logs
  • Using command-line utilities

Module 04: Using the Informatica Administrator tool

  • Administration tool layout and navigation
  • Views in the Manage Tab
  • Services and Nodes
  • License key types

Module 05: Configuring the Model Repository Service

  • Adding and moving a Model Repository Service
  • Auditing

Module 06: Configuring the Data Integration Service

  • Creating connections
  • Home Directory location
  • Stopping and starting a service

Module 07: Users and Groups

  • Creating user and group accounts
  • Creating accounts using scripts
  • Adding users to groups
  • Importing LDAP user accounts and groups

Module 08: Privileges and Roles

  • Configuring roles and privileges
  • Assigning privileges and roles
  • Domain folders and services

Module 09: Permissions

  • Assigning permissions to domains and domain objects
  • Verifying permissions

Module 10: Configuring the Analyst Service

  • Creating connections
  • Analyst Services
  • Analyst Service folders
  • Analyst Service permissions and log
  • The Analyst command line

Module 11: Configuring the Content Management Service

  • Connecting the Developer client to the domain
  • Data Integration Service defaults
  • Creating MRS projects and setting permissions
  • Creating project folders and setting permissions
  • Creating a simple mapping
  • Deploying mappings

Module 12: Configuring the Data Director Service

  • The Scheduler Service
  • Setting a schedule

Module 13: Domain Administration and Management

  • Create and configure a Content Management Service (CMS)
  • Install OOTB content
  • Install Identity (IMO) content
  • Install Address Doctor (AD) content
  • Install the Classifier model

Module 14: Monitoring and Troubleshooting

  • The Monitoring view
  • Configuring Log Management properties
  • Filtering logs
  • Auditing user activity

Module 15: Managing the Domain

  • Managing Alerts
  • Migrating an Informatica Domain from one Database to another

>> Data Quality 10: Developer, Specialist Certification

Certification Overview

This test measures your competency as a member of a project implementation team. This involves in-depth knowledge of Data Quality processes such as Profiling, Standardization, Matching, and Consolidation. You must select and configure the appropriate Data Quality transformations and build, debug, and execute Data Quality mappings, including integrating those mappings into PowerCenter if needed.

When you are ready to test and become an Informatica Certified Specialist (ICS), please follow these steps:

  1. Click Enroll and log in to your Informatica account.
  2. Click Add to Cart and complete your registration/purchase.
  3. Once you have registered, go to My Training and View Your Transcript.
  4. Now you can simply Launch and take your test Anytime/Anywhere prior to your test’s expiry date.

The following can be used as a guide to prepare before taking the test. Included is an outline of the technical topics and subject areas covered in the test, test domain weighting, test objectives, and topical content.

    Skill Set Inventory

Certification Exam | Data Quality | Version 5 | Skill Set Inventory

Test takers will be measured on:

  • Navigation through the Developer Tool
  • Project Collaboration techniques such as Tags and Comments, using Profiles and reference tables to share information with Analysts
  • Profiling methods to identify data irregularities and inaccuracies.
  • Converting a profile into a mapping
  • Using mappings and mapplets to cleanse and standardize data
  • Performing Address Validation
  • Identifying duplicate records
  • Automatic and manual consolidation of duplicate records into a master record
  • Executing mappings in PowerCenter
  • Deploying Data Quality Mappings in an Excel spreadsheet (connect to Web Services)

Test Domains

The test domains, and the extent to which each is represented as an estimated percentage of the test, are as follows:

Title                           % of Test
Informatica Overview            10%
Analyst Collaboration           10%
Profiling                       15%
Standardization/Mapplets        10%
Address Validation              5%
Matching                        10%
Consolidation and the DQA       10%
Integration with PowerCenter    5%
Object Import and Export        5%
DQ for Excel                    5%
Parameters                      5%
Content                         10%

Question Format

You may select from one or more response offerings to answer a question.

You score the question correctly if your response accurately completes the statement or answers the question. Incorrect options (distractors) are presented as plausible answers, so test takers without the required skills and experience may select them.

A passing grade of 70% is needed to achieve recognition as an Informatica Certified Specialist (ICS) in Data Quality 10 Developer.

You are given 90 minutes to complete the test. Formats used in this test are:

  • Multiple Choice:  Select one option that best answers the question or completes the statement
  • Multiple Response: Select all that apply to best answer the question or complete the statement
  • True/False: After reading the statement or questions select the best answer

Test Policy

  • You are eligible for one attempt and re-take, if needed, per test registration.
  • If you do not pass on your first attempt:
    • The purchase of the test will include one second-attempt if a student does not pass a test.
    • You must wait two weeks after a failed test to take the test again.
    • Any additional retakes are charged the current fee at the time of purchase.
    • Promotions are excluded and cannot be combined.

Test Topics

The test contains 70 questions drawn from the sections listed below. To ensure that you are prepared for the test, review the subtopics within each section.

Informatica Overview

  • Describe the Informatica 10 architecture and set up including the repositories required for installation
  • Provide information on the Data Quality process and dimensions of data quality

Analyst Collaboration

  • Use and describe the functionality of the Analyst Tool including Scorecarding, Reference Table Management, Tags, Filters, Profiles and Comments
  • Describe how Analysts and Developers can collaborate on projects
  • Describe the benefits of project collaboration to a team

Profiling

  • Perform and interpret column, rule, comparative and mid-stream profiling

Standardization/Mapplets

  • Be aware of where data standardization fits in the DQ process
  • Apply, configure and troubleshoot data standardization transformations
  • Differentiate between the various parsing techniques available
  • Recognize how reference tables are used in the standardization process

Address Validation

  • Verify the importance of Address Validation
  • Configure the Address Validation transformation including explaining each mode that is available and what custom templates can be used for
  • Interpret the AV outputs generated

Matching

  • Develop and build match plans to identify duplicate or related data
  • Differentiate between the matching algorithms available and explain how matching scores are calculated
  • Configure Identity Matching option for the match transformation
  • Explain populations, the identity match strategies that are available and how they are used in Identity Matching

Consolidation and the DQA

  • Explain Automatic and Manual consolidation techniques and configure the Transformations used for automatic consolidation
  • Use the Exception Transformation to generate and populate the Bad Records and Duplicate Records tables
  • Troubleshoot any problems that may occur
  • Generate a survivor record in the DQA
  • Remove duplicates using DQA

Integration with Power Center

  • Explain how to integrate IDQ mappings and mapplets into PowerCenter Workflows including how content is handled

Object Import and Export

  • Define the difference between the Basic and Advanced Import options available
  • Explain Dependency conflict resolution and how it is handled
  • Describe how to Export a Project

DQ for Excel

  • Describe the requirements for integration for Excel
  • Define the techniques for creating mappings for DQ for Excel
  • Explain the Web service capabilities required for DQ for Excel
  • Provide an explanation on how to use the Informatica ribbon in Excel

Parameters

  • Explain what parameters are and why they are used including what parameter functions are supported
  • Identify which Transformations can use parameters
  • Describe the process of exporting mappings with parameters built in

Content

  • Explain what Content is, what is contained in the Core Accelerator, and why it is used

Sample Test Questions

How are the Identity Match populations installed for Data Quality?

  A. None
  B. One
  C. Multiple, but only if you have been granted the Sources and Target privilege
  D. Multiple (Correct)

The Workflow recovery (introduced in version 9.5.1 HF1) property can be set in ___.

  A. Address reference data, identity population, and accelerator demonstration data (Correct)
  B. Dictionaries only
  C. Address Validator Transformation and Address subscription data
  D. Pre-built Mappings, Demo Mappings, Dictionaries, Address Validator, Address subscription data

In IDQ, scorecard results are stored in the Profile Warehouse.

  A. True (Correct)
  B. False

Which of the following is FALSE with regard to import/export functionality?

  A. The Advanced Import Wizard allows users to perform fine-grain conflict resolution.
  B. When using the Basic Method users must import entire Projects with a generic conflict resolution.
  C. When exporting, the user identifies a set of objects within a project, and all the objects plus their dependencies are exported to XML.
  D. When doing a Basic import the only conflict resolution available is Add Object to Target or Replace Object in Target. (Correct)

Which of the following will impact the total amount of time it takes to execute a column profiling process?

  A. Number of Columns in the file
  B. Number of Rows in the source data
  C. Amount of memory and speed of the processor on the servers
  D. All of the above (Correct)

Additional Information

Retake Policy: Current purchases of the test will include one second-attempt if a student does not pass a test. Any additional retakes are charged the current fee at the time of purchase. Promotions are excluded and cannot be combined. You must wait two weeks after a failed test to take the test again.

Informatica University has a community page so students can assist one another in their test preparation within the Informatica Network: https://network.informatica.com/welcome

For more information on Informatica Certifications visit https://www.informatica.com/services-and-training/certification.html
To find the class that is right for you, fill out the form at the top of this page, or visit our website at http://www.informatica.com/us/services-and-training/training/. For onsite class information contact your local Education Sales Specialist.

>> Data Quality 10: Administrator, Specialist Certification

Certification Overview

This test measures your competency as a full member of a project implementation team, including your ability to install, configure, and administer the Informatica Data Quality (IDQ) product (client, server, and services); set up IDQ Domain security (users, groups, and privileges on IDQ services); manage IDQ connections and prebuilt Content (Accelerators, Address Data, Identity Populations); and perform upgrade, monitoring, and reporting.

When you are ready to test and become an Informatica Certified Specialist (ICS), please follow these steps:

  1. Click Enroll and log in to your Informatica account.
  2. Click Add to Cart and complete your registration/purchase.
  3. Once you have registered, go to My Training and View Your Transcript.
  4. Now you can simply Launch and take your test Anytime/Anywhere prior to your test’s expiry date.

The following can be used as a guide to prepare before taking the test. Included is an outline of the technical topics and subject areas covered in the test, test domain weighting, test objectives, and topical content.

    Skill Set Inventory

Test takers will be measured on:

  • Address Validation
  • Connections
  • Data Director & Workflows
  • Domain Management, Users & Architecture
  • DQ Content
  • DQ Matching
  • DQ Services
  • Identity Populations and Accelerators
  • Installation and Upgrade
  • Monitoring & Log
  • Operations
  • Reporting
  • Security, Users & Architecture

Test Domains

The test domains, and the extent to which each is represented as an estimated percentage of the test, are as follows:

Title                                       % of Test
Address Validation                          10%
Connections                                 6%
Data Director & Workflows                   8%
Domain Management, Users & Architecture     9%
DQ Content                                  4%
DQ Matching                                 6%
DQ Services                                 13%
Installation & Upgrade                      11%
Monitoring & Log                            14%
Operations                                  9%
Reporting                                   6%
Security, Users & Architecture              4%

Question Format

You may select from one or more response offerings to answer a question.

You score the question correctly if your response accurately completes the statement or answers the question. Incorrect options (distractors) are presented as plausible answers, so test takers without the required skills and experience may select them.

A passing grade of 70% is needed to achieve recognition as an Informatica Certified Specialist (ICS) in Data Quality 10 Administrator.

You are given 90 minutes to complete the test. Formats used in this test are:

  • Multiple Choice:  Select one option that best answers the question or completes the statement
  • Multiple Response: Select all that apply to best answer the question or complete the statement
  • True/False: After reading the statement or question, select the best answer

Test Policy

  • You are eligible for one attempt and re-take, if needed, per test registration.
  • If you do not pass on your first attempt:
    • The purchase of the test will include one second-attempt if a student does not pass a test.
    • You must wait two weeks after a failed test to take the test again.
    • Any additional retakes are charged the current fee at the time of purchase.
    • Promotions are excluded and cannot be combined.

Test Topics

The test contains 70 questions drawn from the sections listed below. To ensure that you are prepared for the test, review the subtopics within each section.

Address Validation

  • Be able to describe how Address Validation is installed and configured.
  • Be able to configure properties that help developers use Address Validation in their mappings.

Connections

  • Be able to describe how to create DB connections and why.
  • Be able to describe which DQ Tool uses them and which tools can create them.

Data Director & Workflows

  • Be able to describe the difference between the Data Director application and the Data Director service.
  • Describe Human Task flows and Workflows.
  • Be able to administer workflows.

Domain Management, Users & Architecture

  • Be able to describe how to access the Administrator Console.
  • Be able to describe what a Domain is and what it consists of.
  • Be able to navigate the Administrator Console to know where to configure services, security, and monitoring.

DQ Content

  • Be able to describe Data Quality Content & Installation.
  • Be able to describe the types of accelerators available.

DQ Matching

  • Be able to describe how DQ Content is upgraded.
  • Be able to describe the relationship between Matching and disk utilization.

DQ Services

  • Be able to describe the types of services available.
  • Be able to describe how to create and configure the services.

Installation and Upgrade

  • Be able to describe the installation and upgrade process and the components needed to support development and production DQ environments.
  • Be able to describe the startup and shutdown of services.

Monitoring & Log

  • Be able to describe the various types of logs and their locations.
  • Be able to describe how you can monitor the activity in a domain.

Operations

  • Be able to describe the tasks and processes needed to keep a DQ environment up and running smoothly.
  • Be able to describe how to troubleshoot issues in a DQ environment.

Reporting

  • Be able to describe the types of DQ reporting services available.
  • Be able to describe how to configure DQ reporting services.

Security, Users & Architecture

  • Be able to describe user permissions and privileges.
  • Be able to describe how to create users.

Sample Test Questions

How are the Identity Match populations installed for Data Quality?
A. X Execute Content_installer_client.exe
B. X Execute Population_installer.exe
C. X Execute an import via the Developer client
D. Correct Execute Content_installer_server.exe

The Workflow recovery property (introduced in version 9.5.1 HF1) can be set in ___.
A. Correct DQ Developer
B. X Administrator Console
C. X DQ Analyst
D. X Does not need to be set

After deleting an Application from the Developer tool, the application still shows on the Monitoring tab. In order to remove it permanently from the Monitoring tab, the application must be stopped and undeployed.
A. Correct True
B. X False

In which property does Data Quality store the path to the population files on the Content Management Service?
A. X NLP Options
B. X Address Validation Properties
D. Correct Identity Properties

What is contained in the tables of the staging database?
A. X The Staging database is used to store the cache file locations
B. X The Staging database is used to store Repository tables
C. X The Staging database is used to store PowerCenter mapping tables
D. Correct The Staging database is used to store the Reference Data Tables

Additional Information

Retake Policy: Current purchases of the test will include one second-attempt if a student does not pass a test. Any additional retakes are charged the current fee at the time of purchase. Promotions are excluded and cannot be combined. You must wait two weeks after a failed test to take the test again.

Informatica University has a community page so students can assist one another in their test preparation within the Informatica Network: https://network.informatica.com/welcome

For more information on Informatica Certifications visit https://www.informatica.com/services-and-training/certification.html
To find the class that is right for you, fill out the form at the top of this page, or visit our website at http://www.informatica.com/us/services-and-training/training/. For onsite class information contact your local Education Sales Specialist.


====

> Data Integration: PowerCenter and B2B



Developer

Data Integration for Developers (Instructor Led) OR Developer, Level 1 (onDemand)
PowerCenter Developer, Level 2 (Instructor Led or onDemand)
B2B: Data Transformation for Developers (Instructor Led)
Programs

>> PowerCenter: Developer, Level 1

Course Overview

This course is applicable to software version 10. Explore Informatica PowerCenter 10, which comprises server and client workbench tools used to create, execute, monitor, and schedule ETL processes. Work through the PowerCenter Designer, Workflow Manager, and Workflow Monitor tools while performing tasks such as creating source and target definitions, transformations, mappings, reusable objects, sessions, and workflows to extract, transform, and load data.

Objectives

After successfully completing this course, students should be able to:

  • Utilize PowerCenter 10 Designer to build mappings that extract data from a source to a target, transforming it as necessary
  • Deploy PowerCenter transformations to cleanse, format, join, aggregate and route data to appropriate targets
  • Perform error handling/trapping using PowerCenter mappings
  • Use PowerCenter 10 Workflow Manager to build and run workflows which execute sessions associated with a mapping and to control data transformation processes.
  • Design and build simple mappings and workflows based on essential business needs
  • Complete basic troubleshooting using PowerCenter logs and debugger

Target Audience

  • Developer

Prerequisites

  • None

Agenda

Module 1: An Overview of Informatica PowerCenter

  • PowerCenter Architecture, Terminology, Tools GUI, Mappings, Transformations, Sessions, Workflows and Workflow Monitor

Module 2: ETL Fundamentals

  • Using the Source Analyzer to create flat file and relational Sources
  • Using the Target Developer to create flat file and relational Targets
  • Mapping Designer
  • Workflow Designer
  • Monitoring workflows
  • Previewing Target Data with PowerCenter Designer

Module 3: Troubleshooting

  • PowerCenter Log files
  • Viewing and looking up error messages
  • Correcting mapping and workflow errors

Module 4: PowerCenter Transformations, Tasks and Reusability

  • PC Designer Transformations and Workflow Tasks
  • Active vs. Passive Transformations
  • Expression Transformation and Editor
  • Reusable Designer Transformations

Module 5: Features and Techniques

  • Usage of the Arrange All and Arrange All Iconic features
  • ‘Autolink’ feature
  • ‘Select Link Path’ feature
  • Propagation of Port Properties

Module 6: Joins and Link Conditions

  • Joins
  • Clarify Heterogeneous vs. Homogeneous Joins
  • Joiner Transformation
  • Source Qualifier and joining two relational sources
  • Executing sessions with Link Conditions

Module 7: Using the Debugger

  • Debugger Interface
  • Creating a break point
  • ‘Evaluate the Expression’ functionality
  • Re-Executing mappings

Module 8: Sequence Generators, Lookups and Additional Workflow Tasks

  • Sequence Generator definition and use
  • Lookup Transformations
  • Varying types of Lookups
  • Lookup Caching
  • Flat file Lookups for adding data to relational targets
  • Multiple Row Return Lookup
  • Using Aggregators and Expressions
  • Using the dates Lookup Cache
  • Utilizing Event Wait, Event Timer and Email Tasks
  • Using a Decision task

Module 9: Update Strategies, Routers and Overrides

  • Using Update Strategies and Routers for mappings and determining insert/ update logic for a target
  • Overrides for incremental (daily) loading of the target

Module 10: Sorting and Aggregation Data Using PowerCenter

  • Unconnected Lookups
  • Mapping Parameter/Variables and Mapplets/Worklets
  • Sorter Transformation
  • Aggregator Transformation and Aggregate Functions
  • Unconnected Lookups and how they are called
  • Mapping Parameters/Variables and initialization priority
  • Mapplets and Worklets
  • Transformations for record loading
  • Parameter File Instruction

Module 11: Mapping Design Workshop

  • Designing and building mappings for loading an aggregate table
  • Determining logic for mappings
  • Velocity Best Practices

Module 12: Workflow Design Workshop

  • Designing and building a workflow to load staging tables
  • Determining the correct logic for a workflow
  • Velocity Best Practices

>> PowerCenter: Developer, Level 2

Course Overview

Enhance your developer skills with advanced techniques and functions for PowerCenter. This course focuses on additional transformations and transaction controls, and teaches performance tuning and troubleshooting for an optimized PowerCenter environment.

Objectives

After successfully completing this course, students should be able to:

  • Determine the structure and use of PowerCenter Parameter Files
  • Implement user-defined and advanced functions.
  • Normalize and Denormalize data using PowerCenter
  • Use the Lookup transformation in Dynamic mode
  • Call a SQL stored procedure from a PowerCenter mapping
  • Create and configure a SQL transformation and its two modes of use
  • Design error handling strategies appropriate for the intended purpose of a workflow
  • Make use of the PowerCenter source-based, target-based, and user-based transaction control
  • Utilize constraint-based loading in databases with referential integrity constraints
  • Use the Transaction Control transformation for data-driven RDBMS transaction control
  • Determine the proper use of built-in and optional, mapping-design recovery capabilities
  • Build batch files that use PMCMD and PMREP command line programs
  • Apply PowerCenter Performance Tuning Methodology
  • Describe the effect of mapping design on performance and apply these design principles to a mapping
  • Calculate how much memory a session uses and tune session-level memory
  • Apply partitions, distribute the data and optimize the CPU memory usage

Target Audience

  • Developer

Prerequisites

OR

Agenda

Module 1: PowerCenter Architecture

  • Informatica PowerCenter 10 architecture and key terms
  • PowerCenter’s optional and built-in high availability features

Module 2: Parameter Files

  • IsExprVar property in a mapping
  • Structure of a parameter file
  • Parameter files in mappings and sessions
  • Parameter files used to build mapping expression logic
  • Date/time mapping variable, in a parameter file for incremental loading
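
The parameter file structure this module covers can be sketched as follows. The folder, workflow, and session names below are hypothetical; a PowerCenter parameter file is a plain-text file whose section headings scope each parameter to the domain, a workflow, a worklet, or a session, with more specific sections overriding [Global].

```
[Global]
$$Country=USA

[MyFolder.WF:wf_daily_load]
$$LoadDate=2020-01-01

[MyFolder.WF:wf_daily_load.ST:s_m_load_orders]
$InputFile1=orders_20200101.csv
$$Region=WEST
```

The session is pointed at this file in its properties, and mapping parameters such as $$LoadDate resolve at run time, which is the basis of the incremental-loading technique listed above.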

Module 3: User-Defined and Advanced Functions

  • Advanced functions
  • User-Defined functions
  • Standard name formatting function and implementing UDF in mappings
  • AES_Encrypt and Encode functions
  • Debug the mapping

Module 4: Pivoting Data

  • Normalizer transformation
  • Aggregators
  • Normalization of data into a relational table
  • Denormalization of data into a Fact table

Module 5: Dynamic Lookups

  • Dynamic Lookup Cache
  • Dynamic Lookup to load data into a dimension table
  • Dynamic Lookup used in tandem with an Update Strategy transformation to keep historic data in a dimension table

Module 6: Stored Procedure and SQL Transformations

  • SQL stored procedures
  • SQL transformations in script mode
  • SQL transformations in query mode
  • SQL transformation used to create tables
  • Proper query formatting for SQL transformation
  • Database errors

Module 7: Troubleshooting Methodology and Error Handling

  • Error handling strategies
  • Data errors
  • Update Strategies

Module 8: Transaction Processing

  • Source-based, target-based, and user-based transaction control with and without the High Availability option
  • Constraint-based loading in databases with referential integrity constraints
  • Table Loading and the RDBMS Primary-Foreign key relationship

Module 9: Transaction Control Transformation

  • Transaction control transformation for data-driven transaction control
  • Data Control
  • Transformation variables
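
The data-driven control described in this module is configured as a Transaction Control Condition expression on the transformation. A minimal sketch, assuming hypothetical ports ORDER_ID and v_PREV_ORDER_ID that track the current and previous key value: the expression evaluates for each row and returns one of the built-in variables (TC_CONTINUE_TRANSACTION, TC_COMMIT_BEFORE, TC_COMMIT_AFTER, TC_ROLLBACK_BEFORE, TC_ROLLBACK_AFTER).

```
IIF(ORDER_ID != v_PREV_ORDER_ID,
    TC_COMMIT_BEFORE,        -- new key: commit the rows buffered so far, start a new transaction
    TC_CONTINUE_TRANSACTION) -- same key: keep the row in the open transaction
```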

Module 10: Recovery

  • Workflow and task recovery with and without the high availability option
  • Recover tasks and workflows

Module 11: Command-Line Programs

  • PMCMD, PMREP, and INFACMD command line functionality
  • Batch files that use PMCMD and PMREP
  • Command line utilities

Module 12: Performance Tuning Methodology

  • Source, target and engine bottlenecks
  • Performance counters
  • Tuning different types of bottlenecks
  • Benchmark tests
  • Target bottleneck tests
  • Results evaluation

Module 13: Performance Tuning Mapping Design

  • Best practices in mappings
  • Session properties
  • Inspect and edit mappings
  • Inspect and edit transformations

Module 14: Memory Optimization

  • Session-level memory
  • Transformation caches
  • PowerCenter Performance Counters

Module 15: Performance Tuning: Pipeline Partitioning

  • Partition points
  • Data Partitions
  • Memory optimization

>> PowerCenter Data Integration 10: Developer, Specialist Certification

Course Overview

This test measures your competency in building PowerCenter objects at basic and advanced levels in order to make optimal use of the Designer, Workflow Manager, and Workflow Monitor tools. Additionally, it tests your ability to use transformations, build and run workflows, and work as part of a data integration development team.

When you are ready to test and become an Informatica Certified Specialist (ICS), please follow these steps:

  1. Click Enroll and log in to your Informatica account.
  2. Click Add to Cart and complete your registration/purchase.
  3. Once you have registered go to My Training and View Your Transcript.
  4. Now you can simply Launch and take your test Anytime/Anywhere prior to your test’s expiry date.

Use the material below as a guide when preparing for the test. It includes an outline of the technical topics and subject areas covered in the test, the test domain weighting, test objectives, and topical content.

Target Audience

  • Students seeking certification as a PowerCenter Data Integration Developer Specialist

Prerequisites

The recommended training prerequisites for Specialist certification are the completion of the following Informatica course(s):

    Skill Set Inventory

Certification Exam | PowerCenter | Self-Paced

Test takers will be measured on:

  • Basic Mapping Design
  • Optimal Mapping Design
  • Parameters and Variables
  • Transformation expression syntax
  • Troubleshooting
  • Workflows and Worklets
  • Architecture and Administration

Test Domains

The test domains and the extent to which they are represented as an estimated percentage of the test follows:

Title % of Test
Architecture and Administration 3%
Mapping Design Basic 49%
Mapping Design Advanced 14%
Parameters and Variables 9%
Transformation Language 9%
Troubleshooting 7%
Workflows and Worklets 10%

Question Format

You may select from one or more response offerings to answer a question.

A question is scored as correct if your response accurately completes the statement or answers the question. Incorrect options (distractors) are included as plausible answers, so that test takers without the required skills and experience may wrongly select them.

A passing grade of 70% is needed to achieve recognition as an Informatica Certified Specialist (ICS) in PowerCenter Data Integration Developer.

You are given 90 minutes to complete the test. Formats used in this test are:

  • Multiple Choice:  Select one option that best answers the question or completes the statement
  • Multiple Response: Select all that apply to best answer the question or complete the statement
  • True/False: After reading the statement or question, select the best answer

Test Policy

  • You are eligible for one attempt and re-take, if needed, per test registration.
  • If you do not pass on your first attempt:
    • The purchase of the test will include one second-attempt if a student does not pass a test.
    • You must wait two weeks after a failed test to take the test again.
    • Any additional retakes are charged the current fee at the time of purchase.
    • Promotions are excluded and cannot be combined.

Test Topics

The test contains 70 questions drawn from the sections listed below. To ensure that you are prepared for the test, review the subtopics within each section.

Basic Mapping Design

  • Use of the PowerCenter Designer Tool
  • Use of the PowerCenter Workflow Manager Tool
  • Use of the PowerCenter Workflow Monitor Tool
  • Source Qualifier Transformation
  • Expression Transformation
  • Aggregator Transformation
  • Lookup Transformation
  • Filter Transformation
  • Joiner Transformation
  • Union Transformation
  • Update Strategy Transformation
  • Mapplets
  • User Defined Functions
  • Active versus Passive transformations

Optimal Mapping Design

  • SQL Overrides
  • Stored Procedure Transformation
  • Unconnected Lookup Transformation
  • XML Transformation
  • HTTP Transformation
  • Transaction Management with Active Transformations
  • Java Transformation
  • Normalizer Transformation
  • SQL Transformation

Parameters and Variables

  • Variable Assignment
  • Parameter Files

Expression syntax

  • Transformation Functions

Troubleshooting

  • Practical Experience
  • PowerCenter Workflow Monitor
  • Log Management

Workflows and Worklets

  • Workflows
  • Worklets
  • Workflow Tasks

Architecture

  • Domain and Node Architecture
  • Informatica PowerCenter Commands

Sample Test Questions

Which of the following is the default Transaction Control transformation variable value?
A. X TC_COMMIT_BEFORE
B. X TC_COMMIT_AFTER
C. X TC_ROLLBACK_BEFORE
D. Correct TC_CONTINUE_TRANSACTION

Which one of the following transformations may be unconnected in a valid mapping? (choose one)
A. Correct Lookup
B. X Lookup and Source Qualifier
C. X Lookup and Transaction Control
D. X Lookup, Source Qualifier, or Transaction Control

A Web Service Hub can be associated with more than one PowerCenter Repository Service.
A. Correct True
B. X False

In which of the following selections are all of the transformations active?
A. Correct Router, Update Strategy, Joiner
B. X Expression, Router, Joiner
C. X Update Strategy, Expression, Aggregator
D. X Stored Procedure, Joiner, Router

Which transformations contain the Transformation Scope property?
A. X Filter, Joiner, Rank
B. Correct Aggregator, Joiner, Rank, Sorter
C. X Union, Filter, Aggregator
D. X Rank, Lookup, Aggregator

Additional Information

Retake Policy: Current purchases of the test will include one second-attempt if a student does not pass a test. Any additional retakes are charged the current fee at the time of purchase. Promotions are excluded and cannot be combined. You must wait two weeks after a failed test to take the test again.

Informatica University has a community page so students can assist one another in their test preparation within the Informatica Network: https://network.informatica.com/welcome

For more information on Informatica Certifications visit https://www.informatica.com/services-and-training/certification.html
To find the class that is right for you, fill out the form at the top of this page, or visit our website at http://www.informatica.com/us/services-and-training/training/. For onsite class information contact your local Education Sales Specialist.

>> PowerCenter: Administration

Course Overview

Develop the skills to administer a PowerCenter infrastructure, including the configuration and maintenance of PowerCenter. Learn to create and configure services, manage security, and review system data for issues.

Objectives

After successfully completing this course, students should be able to:

  • Describe core administration tasks and tools
  • Configure the Informatica Administrator tool
  • Create and configure necessary services
  • Plan and implement a backup strategy
  • Manage Informatica security
  • Audit security access and privileges
  • Perform ongoing maintenance
  • Stop or recycle a service
  • Review domain logs

Target Audience

  • Administrators

Agenda

Module 1:  PowerCenter 10 Architecture

  • Informatica 10 technical architecture
  • Informatica Domain, Nodes, and application services
  • Informatica clients
  • Informatica platform components

Module 2: Best Practices

  • Configuring the Informatica 10 environment using recognized best practices
  • Informatica 10 configuration

Module 3: Installing Informatica 10

  • Installation of the Informatica 10 domain
  • Installation of the Informatica 10 clients

Module 4: Using the Administrator tool

  • Administrator tool layout
  • Manage tab options
  • Domain view options
  • Services and Nodes view

Module 5: Configuring PowerCenter Application Services

  • Creating a PowerCenter Repository Service and Repository
  • Creating a PowerCenter Integration Service
  • Creating and configuring domain folders

Module 6: Users and Groups

  • Informatica 10 security model
  • Create and manage user accounts
  • Create and manage groups
  • Infacmd utility to manage domain objects and security

Module 7: Privileges and Roles

  • Manage privileges
  • Manage roles

Module 8: Permissions

  • Overview of permissions
  • Types of permissions
  • Levels of permissions

Module 9: OS Profiles

  • Informatica operating system profiles
  • Configuration
  • Implement operating system profile

Module 10: PowerCenter Repository and Clients

  • PowerCenter repositories
  • Use of PowerCenter Clients and objects in managing users, accounts, groups, roles, privileges, and permissions

Module 11: PowerCenter Web Services Hub Management

  • PowerCenter Web Services Hub
  • Create and configure a PowerCenter Web Services Hub service

Module 12: Configuring the Model Repository Service

  • Overview of the Model Repository Service
  • MRS user interface
  • Manage MRS logs
  • Manage MRS with the infacmd utility

Module 13: Configuring the Data Integration Service

  • Overview of the Data Integration Service
  • DIS user interface
  • Create a DIS
  • Manage DIS with the infacmd utility

Module 14: Analyst Service

  • Purpose of the Analyst service
  • Create an Analyst service
  • View Analyst service permissions and logs
  • Analyst command line

Module 15: PowerCenter Repository Metadata Deployment

  • Deploy PowerCenter metadata from development to test environment
  • Deploy PowerCenter metadata across domains to production environment

Module 16: PowerCenter Domain Management

  • Domain alerts
  • SMTP settings
  • Move an Informatica domain from one database to another

>> PowerCenter Data Integration 10: Administrator, Specialist Certification

Certification Overview

This test measures your competency across installation and configuration, architecture, server maintenance, security, deployment, PowerCenter Repository management, web services, command line utilities and Informatica Velocity Best Practices and Implementation Methodology as a full member of a project implementation team.

When you are ready to test and become an Informatica Certified Specialist (ICS), please follow these steps:

  1. Click Enroll and log in to your Informatica account.
  2. Click Add to Cart and complete your registration/purchase.
  3. Once you have registered go to My Training and View Your Transcript.
  4. Now you can simply Launch and take your test Anytime/Anywhere prior to your test’s expiry date.

Use the material below as a guide when preparing for the test. It includes an outline of the technical topics and subject areas covered in the test, the test domain weighting, test objectives, and topical content.

Target Audience

  • Students seeking certification as a PowerCenter Administrator Specialist

Prerequisites

The skills and knowledge areas measured by this test are focused on product core functionality inside the realm of a standard project implementation. Training materials, supporting documentation and practical experience may become sources of question development.

The suggested training prerequisites for this certification level are the completion of the following Informatica course(s):

    Skill Set Inventory

Certification Exam | Data Integration | Version 10 | Skill Set Inventory

Test takers will be measured on:

  • Architecture
  • Server Maintenance
  • Security
  • Deployment
  • License Keys
  • Repository Management
  • Web Services
  • Command Line Utilities
  • Metadata Reporting
  • Administrator Tool

Test Domains

The test domains and the extent to which they are represented as an estimated percentage of the test follows:

Title % of Test
Architecture 20%
Installation 3%
License Keys 3%
Option-Based Questions 1%
Repository Management 8%
Security 53%
Service Maintenance 11%


Question Format

You may select from one or more response offerings to answer a question.

A question is scored as correct if your response accurately completes the statement or answers the question. Incorrect options (distractors) are included as plausible answers, so that test takers without the required skills and experience may wrongly select them.

A passing grade of 70% is needed to achieve recognition as an Informatica Certified Specialist (ICS) in PowerCenter Data Integration Administrator.

You are given 90 minutes to complete the test. Formats used in this test are:

  • Multiple Choice:  Select one option that best answers the question or completes the statement
  • Multiple Response: Select all that apply to best answer the question or complete the statement
  • True/False: After reading the statement or question, select the best answer

Test Policy

  • You are eligible for one attempt and re-take, if needed, per test registration.
  • If you do not pass on your first attempt:
    • The purchase of the test will include one second-attempt if a student does not pass a test.
    • You must wait two weeks after a failed test to take the test again.
    • Any additional retakes are charged the current fee at the time of purchase.
    • Promotions are excluded and cannot be combined.

Test Topics

The test contains 70 questions drawn from the sections listed below. To ensure that you are prepared for the test, review the subtopics within each section.

Informatica Architecture

  • How the different pieces of the architecture fit together
  • How the different services relate to one another
  • The function of each service
  • The different types of nodes & how they function
  • Be familiar with the client applications & how they interact with the domain
  • Be familiar with the Administrator tool
  • Domain alerts & how they’re configured
  • Domain migration
  • The license service

Service Maintenance

  • Know the function of each service
  • Know which services support which client applications
  • Know how to configure each service
  • Generate a diagnostics file

Security

  • Be able to define the terms “user,” “group,” “privilege,” “role,” & “permission”
  • Know how to configure users, groups, privileges, roles, & permissions
  • Know how users, groups, privileges, roles, & permissions relate to one another
  • Know the Security page in the Administrator Tool
  • Know how to assign users to groups
  • Know how to assign privileges & roles to groups or users
  • Know how to set up permissions in the various applications
  • Know how to set up permissions in the domain
  • Know how Informatica authentication works
  • Default administrator accounts
  • Know the difference between native & LDAP security
  • Know how to configure native & LDAP security
  • Know the difference between assigned & inherited privileges
  • Know the difference between assigned & inherited permissions
  • Know the difference between a system & a custom role
  • Privilege groups
  • Direct, inherited, & effective domain permissions
  • Know how to configure operating system profiles
  • Subject area implementation

Deployment

  • Know the different methods of PowerCenter metadata deployment
  • Advantages & disadvantages of each deployment method
  • Best practices around deployment

License Keys

  • Know the role of the license key file
  • Know how to assign services to a license
  • Know how to view options for a license

Repository Management

  • Know the functionality of each of the client tools
  • Know how to set folder permissions in PowerCenter
  • PowerCenter shortcuts
  • Know how to configure PowerCenter connections
  • Types of objects that reside in a PowerCenter repository

Web Services

  • Informatica web service architecture
  • Know the messaging standard Informatica uses
  • Know the role of a WSDL file as it relates to an Informatica web service
  • Know how to configure an Informatica real-time web service

Command Line Interface

  • Function of each CLI
  • Know which CLIs have an interactive mode
  • Know the successful CLI return code
  • Know how to capture the CLI return code
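
The return-code pattern listed above can be sketched in a short shell script. The stand-in commands true and false are used here so the pattern itself is runnable anywhere; in practice the wrapped command would be a pmcmd or pmrep call, which returns 0 on success and a nonzero code on failure.

```shell
#!/bin/sh
# Sketch: capture a CLI return code and branch on it.
# Replace the stand-in command with a real call such as:
#   pmcmd pingservice -sv <IntegrationService> -d <Domain>
run_and_check() {
  "$@"                      # run the command (e.g. pmcmd ...)
  rc=$?                     # capture its return code immediately
  if [ "$rc" -eq 0 ]; then
    echo "success (rc=$rc)"
  else
    echo "failure (rc=$rc)"
  fi
  return "$rc"
}

run_and_check true          # prints: success (rc=0)
run_and_check false || :    # prints: failure (rc=1)
```

Capturing $? on the very next line matters: any intervening command (even an echo) overwrites it.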

Metadata Reporting

  • Role of the MX views
  • Know the function of “Save MX Data”
  • Metadata extensions
  • Know how to create & manage non-reusable metadata extensions
  • Know how to create & manage reusable metadata extensions

Administrator Tool

  • Know configurations
  • Understand connections and actions

Sample Test Questions

The Service Manager runs which functions on the Master Gateway Node in the domain?
A. X ETL, Configuration, and Alerting
B. X Security, Web Services, and Alerting
C. X Web Services, Configuration, and Licensing
D. Correct Security, Configuration, and Alerting

Which of the following actions can you perform in the PowerCenter Workflow Manager?
A. Correct Create and execute workflows
B. X Create, execute, and monitor workflows
C. X Execute and monitor workflows
D. X None of the above

If a user has permissions on a domain folder in the Administrator tool, then that user has permissions on all objects in that folder.
A. Correct True
B. X False

What are the three types of domain object permissions?
A. X Direct, Inherited, Post Facto
B. X Indirect, Inherited, Effective
C. X Direct, Assigned, Effective
D. Correct Direct, Inherited, Effective

When assigning a license to an application service, which statement is correct?
A. X You can assign licenses so long as you have the license object privilege
B. X The license must have high availability functionality
C. X The application service must have already had a license assigned to it
D. Correct The application service cannot have any licenses assigned to it

Additional Information

Retake Policy: Current purchases of the test will include one second-attempt if a student does not pass a test. Any additional retakes are charged the current fee at the time of purchase. Promotions are excluded and cannot be combined. You must wait two weeks after a failed test to take the test again.

Informatica University has a community page so students can assist one another in their test preparation within the Informatica Network: https://network.informatica.com/welcome

For more information on Informatica Certifications visit https://www.informatica.com/services-and-training/certification.html
To find the class that is right for you, fill out the form at the top of this page, or visit our website at http://www.informatica.com/us/services-and-training/training/. For onsite class information contact your local Education Sales Specialist.

>> B2B: Data Exchange for Developers

Course Overview

This course is applicable to software version 10.2.2. Gain the skills necessary to use the principal features of Informatica B2B Data Exchange for electronic data interchange between partner organizations. Explore the features and benefits of MFT and the advantages of DX-MFT integration.

Objectives

After successfully completing this course, students should be able to:

  • Describe the DX architecture, components, DT services and how they integrate in DX
  • Describe DX Objects, and how they work together to enable partner communication
  • Describe how DX leverages Informatica PowerCenter and vice versa
  • Describe MFT, and various MFT endpoints supported by DX
  • Describe the DX Partner Portal, and its available functionality
  • Apply key concepts, such as Documents and Events, parent-child relationships and Delayed Processing, and how they are used to handle B2B relationships with partners
  • Monitor and manage DX events, and use them to track workflow progress
  • Create and use Event Monitors to control Real-Time and Batch workflow execution
  • Manage DX Objects using Web Services interface
  • Configure the DX dashboard for custom reporting
  • Troubleshoot issues in a DX environment

Target Audience

  • Developer

Prerequisites

  • PowerCenter: Developer, Level 1 (onDemand)

Agenda

Module 1: DX Overview

  • Informatica B2B Data Exchange
  • Business Context of B2B Use-case
  • Features and Benefits of B2B DX
  • DX Architecture/Components
  • B2B Data Exchange 10.2.2 New Features

Module 2: Exploring the DX Interface

  • Using the B2B DX Console
  • Management of DX Objects
  • Endpoint, Application, Workflow, Partner and Account, Profile, Document and Events
  • Lab: Creating Data Exchanges

Module 3: PowerCenter Integration with Real-Time Workflows

  • Functional Components of Real-Time Workflows
  • Create DX Objects and Transformations in a PowerCenter Workflow
  • Connection Objects, JNDI Application, JMS Application Connections and Workflows
  • Running a Real-Time Workflow in DX
  • Lab: DX and PowerCenter Integration
  • Lab: Integrating DX and DT

Module 4: Handling Batch Workflows

  • DX Flow Decision making logic
  • Batch and Real-Time Use-Cases
  • Configuring DX Workflows for Batch
  • Scheduling in DX
  • Lab: Mapping Parameters and DX

Module 5: Managing Documents and Events

  • Overview of B2B DX Events
  • Event Types
  • Event Status
  • Event Status History
  • Event Attributes
  • Event Properties
  • Event Search – Basic & Advanced
  • Viewing Events
  • Dialogue – Root, Parent and Child Events
  • Creating and managing new events using DX Transformations
  • Lab: Integrate DX Profile Parameters

Module 6: Chaining Workflows and Data Flow

  • DX Data flow and Design
  • Content based routing
  • Design Decisions
  • Parent and Child Events
  • Overview of Chained Workflows
  • Chained Workflow Use-Case
  • Lab: Chaining workflows

Module 7: Reconciling Events

  • Introduction to Reconciliation
  • Reconciliation Use-Case
  • DX Reconciliation Process
  • DX Transformations for event reconciliation
  • Reconciliation Monitor
  • Monitor Frequency
  • Handling Reconciliation Issues
  • Lab: Creating a Reconciliation Event
  • Lab: Sending an Acknowledgment
  • Lab: Completing the Reconciliation Process

Module 8: Delayed Events

  • Introduction to Delayed Events and Delayed Processing
  • Delayed Event Use-Case
  • Delay Rules
  • Release Rules
  • Rule Evaluation
  • Setting up Delayed Processing in DX
  • Lab: Setting up Delayed Processing in DX

Module 9: DX MFT Integration

  • Introduction to MFT
  • Features and Benefits of MFT
  • DX – MFT Integration
  • Installing MFT
  • MFT Endpoints
  • Lab: Managing the Dashboard
  • Lab: Setting Up Email Resource with MFT Portal
  • Lab: Setting up an MFT File Transfer from Source to Target Endpoint
  • Lab: Using a Built-in Scheduler to Execute Projects at Future Dates and Times
  • Lab: Create endpoints that directly interact with MFT

Appendix I: Using Event Monitors

  • Introduction to Event Monitors
  • Using Events to Monitor Processing in Real Time
  • Viewing Monitored Events
  • Overview of DX Advanced Exception Handling
  • Event Monitor to track exceptions
  • Lab: Monitor Processing Using Events

Appendix II: Using the Partner Portal

  • Overview of Partner Portal
  • Dashboard
  • Monitoring Event
  • On-boarding
  • File Exchange
  • Authentication & Access Control
  • DX User Authentication
  • Native and ISP Authentication Modes
  • DX Access Control
  • User Groups and Categories
  • Lab: Getting Started with Partner Portal

Appendix III: Dashboard and Reports

  • Overview of Dashboard and Reporting
  • Dashboard and Reports Structure
  • Customizing the Dashboard in Logi Info Studio
  • Lab: Overview of Dashboard
  • Lab: Creating a Dashboard

>> B2B: Data Transformation for Developers

Course Overview

This course is an introduction to Informatica’s B2B Data Transformation for developers, analysts, and other users responsible for designing and implementing transformations. It also includes best practices for Parsing, Serializing, and Mapping tasks. This course is applicable to software version 10.2.0.

Objectives

After successfully completing this course, students should be able to:

  • Discuss the key concepts of the B2B Data Transformation
  • Utilize parsing techniques
  • Move information from an XML structure to an output format
  • Perform XML to XML transformation with Mappers
  • Discuss various preprocessors and their uses
  • Describe locators, keys, and indexing and their uses
  • Explain concepts such as logic implementation via conditions and specification-driven transformation
  • Use Streamers to process large documents and apply validators
  • Describe the notification process and failure-handling techniques
  • Discuss the purpose of Libraries
  • Explore the usage of Data Transformation services in B2B Data Exchange and the usage of prebuilt libraries in Data Transformation
  • Discuss PowerCenter integration via the Data Processor transformation

Target Audience

  • Developer

Prerequisites

Agenda

Module 1: Data Transformation Overview

  • Data Transformation
  • Universal Data Transformation
  • DT Studio
  • Additional components
  • Architecture overview
  • Deployment and Services
  • Developer/Data Processor components and perspectives
  • Design workflow
  • Wizard walk-through

Module 2: XML and XSD Basics

  • Overview of both XML and XSD
  • Schema creation
  • Compare the XSD and Data Transformation syntax
  • IntelliScript
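The XML and XSD concepts covered in this module are independent of any Informatica tooling. As a minimal refresher, Python's standard library can parse an XML document like the ones Data Transformation consumes; the element names here are invented for illustration and are not part of any Data Transformation schema:

```python
import xml.etree.ElementTree as ET

# A tiny XML document; the element names are hypothetical.
doc = """
<order id="1001">
  <customer>Acme Corp</customer>
  <item sku="A-42" qty="3"/>
</order>
"""

root = ET.fromstring(doc)
print(root.tag, root.attrib["id"])      # order 1001
print(root.find("customer").text)       # Acme Corp
print(root.find("item").attrib["qty"])  # 3
```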

Module 3: Parsers

  • Parsers overview
  • Parser properties
  • Parser format
  • Parser use cases
  • Basic anchors in Data Transformation
  • Types of Anchors
  • Anchor properties
  • Markers and their properties
  • Lab: Basic Parsing

Module 4: Anchors and Marking

  • Scope of Anchors
  • Phase of Anchors
  • Marking and Marking options
  • Lab: Phase and Marking

Module 5: Group and RepeatingGroup

  • Group overview
  • Group properties
  • RepeatingGroup functionality
  • RepeatingGroup properties
  • Lab: Group and RepeatingGroup

Module 6: Matching

  • Regular expressions
  • Regex special characters
  • Character classes
  • Line/Word anchors
  • Conditions
  • Pattern search in DT
  • Variables
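The regular-expression topics listed above (special characters, character classes, line/word anchors) apply to any regex engine, not just Data Transformation. A brief sketch in Python, with patterns and sample text invented purely for illustration:

```python
import re

text = "Invoice INV-2023-001\nTotal: 149.95 USD"

# Character class \d with quantifiers: match the invoice number.
inv = re.search(r"INV-\d{4}-\d{3}", text)
print(inv.group())   # INV-2023-001

# Word anchors \b around a decimal amount.
amt = re.search(r"\b\d+\.\d{2}\b", text)
print(amt.group())   # 149.95

# Line anchor ^ with MULTILINE: lines starting with "Total".
totals = re.findall(r"^Total:.*$", text, re.MULTILINE)
print(totals)        # ['Total: 149.95 USD']
```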

Module 7: Actions and Transformers

  • Types of Actions
  • Default and commonly used Transformers
  • Differences between Actions and transformers
  • Commonly used actions
  • Data manipulation functions
  • Conditional processing
  • Alternatives
  • Groups and EnsureCondition
  • Alternatives and EnsureCondition
  • Lab: Transformers, Actions, and Conditions

Module 8: Mapper

  • Mapper overview
  • Mapper properties
  • Mapper actions and anchors
  • Map overview
  • SetValue overview
  • Lab: Simple mapper

Module 9: GroupMapping and Locators

  • GroupMapping overview
  • GroupMapping properties
  • Group Locators – By Occurrence, By Key
  • GroupMapping Use cases
  • Lab: GroupMapping and Locators

Module 10: RepeatingGroup and Alternative Mappings

  • RepeatingGroupMapping overview
  • RepeatingGroupMapping properties
  • Locators in RepeatingGroupMapping
  • RepeatingGroupMapping use cases
  • AlternativeMappings overview
  • AlternativeMappings properties
  • AlternativeMappings use cases
  • CalculateValue and EnsureCondition
  • Lab: RepeatingGroup and Alternative Mappings

Module 11: Serializer

  • Serializer overview
  • Serializer anchors
  • Transformers on serializers
  • Group serializer
  • Create a serializer
  • Commonly used string and content serializers
  • How serializers utilize locator property
  • RepeatingGroup Serializer overview
  • RepeatingGroup Serializer properties
  • RepeatingGroup Serializer use cases
  • AlternativeSerializer overview
  • AlternativeSerializer functionality
  • AlternativeSerializer use cases
  • Actions in Serializer
  • Lab: Simple Serializer
  • Lab: RepeatingGroup Serializer

Module 12: Global Components

  • Global components overview
  • Uses of global components
  • Connecting components – RunParser, RunSerializer, and RunMapper
  • Lab: Global Components

Module 13: Embedded Components

  • Embedded Parser overview
  • Embedded Parser properties
  • Connect component
  • Embedded Parser use cases
  • Embedded Mapper overview
  • Embedded Mapper properties
  • Embedded Mapper use cases
  • EmbeddedSerializer overview
  • EmbeddedSerializer properties
  • EmbeddedSerializer use cases
  • Lab: EmbeddedParser
  • Lab: EmbeddedMapper

Module 14: Advanced Parsers

  • EnclosedGroup
  • DelimitedSections
  • FindReplaceAnchor
  • Lab: Advanced Parsers

Module 15: Action Components

  • Sort
  • AggregateValues
  • AppendValues
  • XSLTMap
  • DelimitedSectionsSerializer

Module 16: Deploying Services

  • Deploying Services overview
  • Deploying Services properties
  • Lab: Use Data Processor transformation in Mapping

Module 17: Libraries

  • Purpose of Libraries
  • Library structure
  • Installing Libraries
  • Libraries and Data Processor Transformation
  • Library object development
  • Video demo (HL7): Create a library transformation with a wizard

Module 18: Unstructured Data Transformation

  • Unstructured Data Transformation overview
  • Unstructured Data Transformation components
  • How it works with DT
  • Relational Hierarchies
  • Lab: Additional Exercise

Appendix I: Command Line Execution

===

> Product 360


Product 360: Create and Manage Catalogs (Instructor Led)
Product 360: Configuration and Management (Instructor Led)
MDM Product 360: Media Manager (onDemand)
MDM Product 360: Advanced Export (onDemand)
Product 360: Create and Manage Catalogs (Instructor Led)
Product 360: Configuration and Management (Instructor Led)
ActiveVOS: Fundamentals (onDemand)

Product 360 Analyst/Data Steward

Product 360: Create and Manage Catalogs (Instructor Led)

>> Product 360: Create and Manage Catalogs

Course Overview

This course is designed for users responsible for managing product information using Informatica Product 360. Explore the skills needed to manage master and supplier catalogs and item and product information through the Informatica MDM Product 360 web and desktop clients. This course is applicable to software version 8.1.

Objectives

After successfully completing this course, students should be able to:

  • Define Product 360 use cases
  • Describe Product 360 tools and terms
  • Describe Product 360 architecture
  • Onboard product information
  • Add product and item information
  • Manage product catalogs
  • Import structured and unstructured data
  • Merge Product information
  • Collaborate as a team through workflows

Target Audience

  • Business Analyst
  • Data Analyst
  • Data Steward
  • Developer

Prerequisites

  • None

Agenda

Module 1: Introduction

  • The challenges of managing product data
  • Discuss use cases for Product 360
  • Product 360 tools, terminology and architecture

Module 2: Product 360 Desktop and Web clients

  • Access Product 360 desktop client
  • Defining catalogs in desktop
  • Add item information from both web and desktop clients
  • Search, sort, group, and export item information from both web and desktop clients
  • Navigate Product 360 web client
  • Manage catalogs from both web and desktop clients
  • Add item information from the web client
  • Search, sort, group, and export item information from the web client
  • Manage assortments, classifications, characteristics, and structures from the web client

Module 3: Data Cleansing and search

  • Create and manage queries through the web client
  • Use Smart Search in the web client

Module 4: Data import and export

  • Describe import and export process
  • Import and Export Data through the web client

Module 5: Workflows

  • Describe tasks and workflows
  • Take part in workflows through the web client

>> Product 360: Configuration and Management

Course Overview

This course is applicable to software version 8.1.1. Gain the skills required to configure, administer, and manage the Product 360 application.  Learn how to onboard data, perform merge, cleanse and enrich data, manage Supplier Portal, manage media assets, export data, manage workflows, and maintain user and user groups.

Objectives

After successfully completing this course, students should be able to:

  • Describe the P360 architecture and application interfaces
  • Describe Product 360 terminologies
  • Set up the taxonomy
  • Customize the repository and data model
  • Onboard and manage suppliers through the Supplier Portal
  • Configure manual and hotfolder import
  • Merge data into the master catalog
  • Describe data maintenance functions
  • Create a product and add items to products
  • Classify product to structure group
  • Create item assortments
  • Manage media assets
  • Configure and execute data quality rules
  • Export data
  • Manage workflows
  • Manage users and user groups
  • Configure dashboards and flexible UIs
  • Describe and define a characteristic
  • Describe the functions of Audit Trail

Target Audience

  • Administrator

Prerequisites

Agenda

Module 00: Course Overview

  • Engage with the participants during the training session
  • State the expectations from the training session
  • Discuss the overall topics covered in the course

Module 1: Product 360: Introduction

  • Describe the P360 application architecture
  • Describe P360 user interfaces
  • Describe the P360 terminology

Module 2: Taxonomy, Repository Editor, and Unit Systems

  • Set up the product taxonomy
  • Describe the repository types in Repository Manager
  • Configure the repository to extend the data model
  • Create static and dynamic enumerations
  • Describe Unit Systems
  • Lab: Create a structure system
  • Lab: Review repository files
  • Lab: Create new fields
  • Lab: Create new categories
  • Lab: Create fields with static and dynamic enumeration

Module 3: Onboard Data

  • Configure manual and hotfolder imports
  • Define and manage suppliers and supplier catalogs
  • Register and invite suppliers to Supplier Portal
  • Import data and media via the Supplier Portal
  • Merge data into the master catalog
  • Lab: Configure and perform a manual import
  • Lab: Configure and perform Hotfolder import
  • Lab: Register new suppliers and send invites
  • Lab: Import data and media via Supplier Portal
  • Lab: Perform Merge

Module 4: Data Maintenance

  • Describe data maintenance functions
  • Create and edit items
  • Create a product and add items to it
  • Classify product to a structure group
  • Create static and dynamic assortments
  • Describe lookups
  • Manage media assets using Media Manager
  • Lab: Create a product and assign items to it
  • Lab: Classify product to a structure group
  • Lab: Create static and dynamic assortment
  • Lab: Manage media assets

Module 5: Data Quality

  • Describe data quality definitions, principles, rules and checks
  • Describe the data quality life cycle
  • Configure and schedule data quality rules
  • Execute and analyze data quality runs
  • Lab: Execute a manual data quality rule
  • Lab: Execute a scheduled data quality rule on entity change
  • Lab: Execute a scheduled time-controlled data quality rule

Module 6: Export

  • Describe the export management architecture
  • Define Export channels
  • Configure export format templates and export profiles
  • Define data sources, export parameters, and variables
  • Schedule and execute an export run
  • Lab: Create export format template
  • Lab: Create an export profile

Module 7: Workflow Management

  • Describe the Workflow components
  • Describe Tasks and Triggers
  • Describe ActiveVOS
  • Identify user roles and privileges specific to tasks
  • Configure triggers
  • Deploy and monitor workflows
  • Lab: Configure workflow triggers
  • Lab: Create tasks

Module 8: Administration

  • Manage users and user groups
  • Configure the UI (dashboards and task UI templates)
  • Describe the functions of Audit Trail
  • Lab: Create user and user groups
  • Lab: Configure a dashboard
  • Lab: Define a characteristic and configure a flex UI

>> MDM Product 360: Media Manager

Course Overview

This course focuses on the maintenance of images, pipelines, Data Quality workflows, derivative workflows, external preview generation, setting access levels, and the Media Manager integration with Product 360 (formerly PIM) in detail. Through a series of demonstrations and hands on practice labs, learn to configure various workflows, previews, pipelines, and access levels.

Objectives

After successfully completing this course, students should be able to:

  • Understand the basics of color management
  • Discuss ICC-Profiles and list their benefits
  • Discuss image transparencies
  • List the difference between pixel- and vector graphics
  • Classify the measurements LPI, DPI and PPI
  • Work with the pipeline application
  • Configure the states and property fields for each workflow
  • Work with the Workflow Module
  • Define your own Data Quality workflow
  • Define derivatives in workflow
  • Design a derivative workflow
  • Configure an external preview
  • Define special previews
  • Discuss working with access levels
  • Configure the integration of Media Manager to PIM
  • Configure Message Queue workflows
  • Configure Message Queue
  • Discuss the different Auto-Assignment modes

Target Audience

  • End-users
  • Media asset managers

Prerequisites

Agenda

Module 1: Images & Color Management

  • Basics of Color Management
  • ICC-Profiles
  • Image Transparency
  • Pixel Images versus Vector Graphics
  • Image Types
  • LPI, DPI, & PPI

Module 2: Pipeline

  • Working with the Pipeline Module
  • Defining a Pipeline
  • Verifying a Pipeline

Module 3: Data Quality Workflow

  • Preparing the states and property fields for each workflow
  • Working with the Workflow Module
  • Defining Data Quality Workflow

Module 4: Derivatives Workflow

  • Derivatives in Workflow Module
  • Designing a Derivatives Workflow

Module 5: External Preview Generation

  • Using the External Preview Generation
  • Configuring the External Preview Generation
  • Defining special Previews

Module 6: Access Levels

  • Working with Access Levels
  • Defining Access Levels
  • Setting Access Levels for Assets and Groups

Module 7: PIM MAM Integration

  • Configuring an integration
  • Message Queue Workflows
  • Message Queue
  • Differing Auto-Assignment Modes

>> MDM Product 360 8.x: Advanced Export

Course Overview

Learn to leverage Product 360 export functionality to support organizational web interface requirements in this process-oriented class. Through a series of demonstrations and exercises, students review specifications, design and develop multi-file export templates, develop a preview template for product review, and create a data quality report for content validation applied during an export.

Objectives

After successfully completing this course, students should be able to:

  • Describe Product360 Data Model, Data Sources, Modules vs. Sub Modules
  • Interpret and implement interface specifications
  • Design/Develop complex multi-file export interface to support customer’s web shop
  • Export Features and Attributes with metadata information
  • Use functions in export templates
  • Create configurable and flexible exports

Target Audience

  • Developers
  • End Users

Prerequisites

  • PIM 7.1 (Product 360): New User Extended – (OnDemand)

Agenda

Module 1:  Introduction

  • Course Overview
  • Introduction to HENRI

Module 2: MDM Product360 Data Model Revisited

  • Data Model
  • Structures and Feature Inheritance

Module 3: MDM Product360 Export Basics Revisited

  • Overview of Export Functionality
  • Creating a Product Attribute List

Module 4: Creating a Complex Web Shop Export

  • Understanding the Specification
  • Exporting Structure Groups
  • Exporting the Feature Pool
  • Exporting Items and Products: Basic Data
  • Exporting Items and Products: Attributes and References
  • Exporting Items and Products: Prices

Module 5: HTML Previews

  • HTML Preview Basics
  • Creating an HTML Data Sheet

Module 6: Data Quality in Export

  • Data Quality Options in Export
  • Creating an Excel Report on Data Quality

>> ActiveVOS: Fundamentals

Course Overview

Discover best practices and techniques needed to implement standards-based composite applications using ActiveVOS. Included in this review are the basic skills needed to design, develop, test, deploy, and monitor new business processes in ActiveVOS.

Objectives

After successfully completing this course, students should be able to:

  • Install and configure ActiveVOS Designer.
  • Describe the full product capabilities of Designer.
  • Distinguish between Business Process Modeling Notation (BPMN) and Business Process Execution Language (BPEL) standards.
  • Navigate the components of the Designer window.
  • Create an orchestrated process using the Designer graphical interface constructs.
  • Apply best practices and techniques for planning and developing executable processes.

Target Audience

  • Architects and Developers new to ActiveVOS

Prerequisites

  • A basic understanding of Web Services as well as XML, WSDL, and XML Schema
  • Java development background is recommended but not required

Agenda

Module 1: Introduction

  • Product Capabilities
  • Installation and Configuration

Module 2: Getting Started with ActiveVOS Designer

  • Key Concepts of BPMN and BPEL (Lab 2.1)
  • Relationship between BPMN and BPEL
  • Project Organization and Meta-data

Module 3: ActiveVOS Designer GUI Basics

  • Menus and Toolbars
  • Visual Notation Constructs and Details, Annotations, and Other Module Techniques

Module 4: Process Orchestration

  • XML Schema and WSDL Overview
  • Message Exchange Patterns (MEPs)
  • Process Planning and Design Approaches
  • Participants: Consumer and Providers

Module 5: Expression Building (Adding Logic to the Process Model)

  • Expression Building Overview

Module 6: Process Simulation for Debugging

  • Simple Simulation Review
  • Process Simulation Overview
  • Simulation Concepts
  • Simulation and Faults
  • Simulate faults from runtime

Module 7: Process Deployment

  • Process Deployment Sequence
  • Preparation for Deployment
  • The Process Deployment Cycle
  • Process Deployment Descriptor File Creation
  • Deployment of a Process with Multiple Partner Services
  • Business Process Archive File Definition
  • Archive File Creation
  • Contribution Management
  • Deployment Tips

Module 8: Process Execution

  • Process Execution Overview
  • Active Process Inspection

Module 9: Best Practices

=============

> Master Data Management

 


Data Director Developer

MDM: Multidomain and Hierarchy Configuration (Instructor Led)
MDM: Configuring Informatica Data Director (Instructor Led or onDemand)

Entity Services Developer

MDM: Multidomain and Hierarchy Configuration (Instructor Led)
Multidomain Entity 360 for Developers (Instructor Led OR onDemand)
Multidomain MDM 10.3 Developer, Specialist Certification

Administrator

Multidomain Admin & Installation (Instructor Led OR onDemand)
MDM: Multidomain Edition (Instructor Led OR onDemand)
Multidomain MDM 10.x Administration, Specialist Certification

Analyst/
Data Steward

MDM: Using Informatica Data Director (Instructor Led)
OR
Using Entity 360 (Instructor Led)

>> MDM: Multidomain and Hierarchy Configuration

Course Overview

This course is applicable to software version 10.x. Discover the skills necessary to configure Informatica MDM Multidomain Edition (MDM Hub) for a given data model and set of business rules. Learn the concepts of hierarchies, and configure entity objects, entity types, hierarchies, relationship base objects, relationship types, and profiles in the Hierarchy Manager.

Objectives

After successfully completing this course, students will be able to:

  • Define and create a data model within MDM
  • Configure MDM processes including Stage, Load, and Match
  • Leverage Merge Manager and Data Manager for Merge and Unmerge processes
  • Utilize Log files
  • Describe hierarchies
  • List the components of hierarchies
  • Create entities and entity types
  • Define relationships and relationship types
  • Configure profiles
  • Deploy hierarchies

Target Audience

  • Developer

Agenda

Multidomain Configuration

Module 1: Introduction

  • Master Data Management definitions
  • Goals of Master Data Management (MDM)
  • Key capabilities of Informatica’s Multidomain Edition (MDM Hub) solution
  • Informatica MDM Hub Application Server and Database Server tiers
  • Master data workflow within the Informatica MDM Hub
  • Capabilities of the Trust Framework

Module 2: Define the MDM Data Model

Lesson 1: Define the Data Model
  • Landing tables purpose, content and best practices
  • Base object tables purpose and content
  • Staging tables purpose and content
  • Creating Landing tables, base object tables, and staging tables
Lesson 2: Customer Data Models
  • Creating landing tables from Sales source system
  • Creating the Customer base object
  • Creating staging tables for Customer data from the Sales and CRM source systems
Lesson 3: Relationships
  • Identifying One-to-Many and Many-to-Many relationship types
  • Foreign-key relationships between base objects of the data model
  • Many-to-many relationships between the base objects of the data model

Module 3: Configuring the Stage Process

Lesson 1: Process Server & Creating Mappings
  • Purpose of the Process Server
  • Registering a Process Server
  • Basic and complex mappings
  • Building blocks used in transformational logic mapping
  • Purpose of a function
  • Associating function types to descriptions
  • Predefined function types
  • Constants
  • Conditional Execution Component
  • Basic Mapping
  • Cleanse List
  • Reusable Cleanse components with descriptions
  • Supported external data cleansing options
  • Testing a mapping
  • Graph and cleanse functions in a Mapping
Lesson 2: Configuring Delta Detection and Audit Trail
  • Delta Detection purpose
  • Conditions for using delta detection
  • Conditions under which Data Rejection occurs
  • Characteristics of Audit Trail
  • Delta detection options and raw data retention periods
  • Stage Process Flow sequence
  • No error jobs
  • Batch Viewer
  • Stage job

Module 4: Configuring the Load Process

Lesson 1: Setting Trust, Validation Rules, and the Load Process
  • Overview of Trust
  • Calculating the trust
  • Conditions for leveraging Trust
  • Characteristics of Validation Rules
  • Setting trust
  • Behaviors for switching on or off various functions
  • Adding a Validation Rule
  • Setting Cell and Null Updates
  • Load process for UPDATES, INSERTS, and REJECTS

Module 5: Configuring Match and Merge Processes

Lesson 1: Match and Merge Overview
  • Resolving issues with Master Data Management
  • Resolving duplication
  • Challenges associated with duplicate records
  • Purposes of Matching and Merging
  • Exact and Fuzzy matching
  • Match Tokens
  • Match Path
  • Match Columns
  • Characteristics of the Provider Columns for Match Columns
  • Multiple records match on multiple columns
  • Match Process Flow
  • Configure matching in the MDM Console

Lesson 2: Configuring Exact Matching

  • Match rules
  • Exact matching
  • Match rule properties for filtering (Matching NULLS and Segment Matching)
  • Match rules using match columns
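As a generic sketch of what an exact match rule does conceptually (grouping records whose match columns are identical), not MDM Hub's implementation; the record layout and field names here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical records; the fields are invented for illustration.
records = [
    {"id": 1, "last_name": "SMITH", "zip": "10001"},
    {"id": 2, "last_name": "SMITH", "zip": "10001"},
    {"id": 3, "last_name": "JONES", "zip": "94105"},
]

# An exact match rule on (last_name, zip): records with identical
# values in every match column fall into the same group.
groups = defaultdict(list)
for rec in records:
    groups[(rec["last_name"], rec["zip"])].append(rec["id"])

duplicates = [ids for ids in groups.values() if len(ids) > 1]
print(duplicates)   # [[1, 2]]
```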

Lesson 3: Configuring Fuzzy Matching

  • Fuzzy matching
  • Characteristics of Populations
  • Matching population types to their descriptions
  • Match Keys
  • Key Type matching keys property
  • Key Width match key property
  • Search level used in Fuzzy matching
  • Configuring a range search
  • Fields for standard populations
  • Match Purpose fuzzy rules property
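MDM Hub's fuzzy matching is driven by its own population-based match keys and match purposes. Purely as an illustration of the general idea of fuzzy similarity scoring (this is not the MDM algorithm), Python's standard library offers a simple ratio:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity score (illustrative only;
    not the matching algorithm used by MDM Hub)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

pairs = [
    ("John Smith", "Jon Smith"),       # near-duplicate
    ("Acme Corp", "ACME Corporation"), # partial overlap
    ("John Smith", "Mary Jones"),      # unrelated
]
for a, b in pairs:
    print(f"{a!r} vs {b!r}: {similarity(a, b):.2f}")
```

A fuzzy match rule would compare such scores against a threshold to decide whether two candidate records refer to the same entity.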

Lesson 4: Configuring the Merge Process

  • Consolidation state of a record
  • Sequence of the Merge Process
  • Distinguish when to use Automatic versus Manual Merge methods
  • Immutable Source System
  • Distinct system
  • Rules of a distinct source system
  • Unmerge Child When Parent Unmerges option
  • Selecting Merge Properties (Match/Merge Setup Details)
  • Automatic match and merge
  • Running automatic match and merge jobs

Module 6: Configuring Data Access Views

Lesson 1: Creating Queries and Packages
  • Goals of queries and packages
  • Impact Analysis tool
  • Packages types
  • Creating a PUT-enabled package

Module 7: Configure Batch Processes

Lesson 1: Configuring and Executing Batch Processes
  • Jobs most often assembled into Batch Groups
  • Assembling a Batch Group
  • Batch Groups, Batch Group Levels, Batch Jobs
  • Creating and Executing a Batch Group

Module 8: Utilize Data Management Tools

Lesson 1: Managing Records with Merge Manager and Data Manager
  • The purposes of Data Steward and Merge Manager
  • Editing and merging records manually
  • Using Data Manager
  • The results of an unmerge
  • Standard and tree unmerge

Module 9: System Information and Logs

Lesson 1: Enterprise Manager

  • MDM Hub Console Enterprise Manager tabs

Lesson 2: Log Files

  • Log files related to the MDM Hub Console
  • Using Enterprise Manager’s ORS Database tab

Module 10: User Objects

Lesson 1: User Exits and Custom Objects

  • User objects registered with MDM
  • User Object Registry tool
  • User Exits
  • Exits used in the Stage, Load, Match, Merge, and Unmerge processes

Module 11: Additional MDM Product Features

Lesson 1: Brief Overview of Additional Features

  • Exporting an ORS to a changelist using Metadata Manager

Hierarchy Manager

Module 1: Introduction

  • Challenges of managing relationships
  • Objectives of hierarchy management
  • Review the scenario used for the lab exercises in this course

Module 2: Enabling Hierarchy Manager

  • Hierarchy Manager for an MDM Operational Reference Store
  • System repository base objects (RBOs)

Module 3: Entities and Entity Types

  • Entity Base Objects
  • Entity Types

Module 4: Hierarchies, Relationships and Relationship Types

  • Hierarchies and Relationships
  • Relationship Types

Module 5: HM Packages

  • Package for a relationship object
  • Build a package for a FK relationship object
  • Generate a package for an entity object

Module 6: HM Profiles

  • New profile creation
  • Assign packages to a profile

Module 7: Loading Data and Testing HM Configuration

  • Load data and test the HM configuration

> MDM: Configuring Informatica Data Director

Course Overview

This course is an overview of the Master Data Management concept using the Informatica MDM Data Director tool. It covers the essential terminology and concepts used to develop IDD applications, which are necessary to understand what goes into an IDD implementation. This course is applicable to software version 10.1.

Objectives

After successfully completing this course, students should be able to:

  • Describe terms and concepts around MDM and Data Governance
  • Use MDM components
  • Create an IDD application and configure application-level properties
  • Create subject areas and subject area groups
  • Relate base object relationships with subject area relationships
  • Configure lookups
  • Utilize MDM Hub cleanse functions from within the IDD application
  • Configure data search capabilities
  • Import and export bulk master data within IDD
  • Track data changes with timeline
  • Access hierarchies from IDD
  • Configure workflows, tasks, and IDD security
  • Localize an IDD application and configure custom help
  • Configure IDD application to access Hierarchy Manager
  • Customize user interface extensions
  • Learn and use e360 framework
  • Learn and understand the usage of the Provisioning tool

Target Audience

  • Developer

Prerequisites

  • None

Agenda

Module 1: Introduction

  • Master Data Definition
  • Challenges of master data across applications
  • Data Governance and its Framework
  • Primary drivers for Data Governance
  • MDM Hub with Informatica Data Director (IDD)
  • Key functionalities of Informatica Data Director (IDD)
  • IDD Components and pre-configuration steps
  • Configuring custom start page for a role

Module 2: Components of an IDD Application

  • IDD Configuration Manager
  • Components of an IDD application and options
  • Binding an IDD schema
  • IDD application as a source system
  • Trust in the MDM Hub
  • Assigning the highest trust source system used for the IDD application
  • Creating and binding a new IDD application to an ORS

Module 3: Subject Areas and their Relationships

  • Subject Area Group Overview
  • Creating a Subject Area Group
  • The Subject Area purpose
  • MDM Hub and Subject Areas and Subject Area Groups
  • Relating an MDM Hub Multidomain data model to an IDD Subject Area
  • Subject Area descriptions and associations
  • Search Result Display Packages
  • Configuring the layout of a Subject Area
  • Creating child subject area relationships
  • Grandchild subject area relationships
  • Sibling references
  • Including parent data in a subject area
  • Filtering child records

Module 4: Lookups and Data View Features

  • The purpose of lookups
  • Dropdown lookup fields
  • Dependent lookups
  • Lookup tables with a sub-type column
  • Localizing lookup values
  • Entity Search lookups

Module 5: Data Cleansing, Standardization, and Validation

  • Review the cleanse, standardization, and validation capabilities
  • Cleansing primary and child objects
  • ValidationStatus cleanse parameters
  • Using and testing a cleanse function

Module 6: Search and Match Master Data

  • Types of search provided in IDD
  • Basic Search
  • Extended Search
  • Configuring Extended Search Capabilities
  • Finding Duplicates for New and Existing Records
  • Advanced Search
  • Advanced Query Builder tool
  • Configuring and performing Smart Searches

Module 7: Master Data Imports and Exports

  • Export Profiles
  • Import Profiles
  • Data Import Templates
  • Using the Configuration Manager
  • Using the Import Wizard

Module 8: Managing Data Change Events

  • Timeline feature
  • Effective Dates and History
  • Timeline scenarios
  • Determining the active version
  • Leveraging timeline rules
  • Enabling timeline in the MDM Hub
  • Timeline state management
  • New and future records creation with the timeline feature
  • Viewing and searching for timeline records

Module 9: Configuring IDD for Hierarchies

  • Hierarchy in MDM
  • Hierarchy components
  • Hierarchy Manager in MDM and IDD

Module 10: Workflow and Security

  • Workflow definition
  • State management in IDD workflows
  • Workflow Configuration in IDD
  • Task Notification and Assignment engines

> Multidomain MDM Entity 360 for Developers

Course Overview

This course is applicable to software versions 10 through current. Learn to configure the major components of Entity 360 for the maintenance and consumption of master data. Leverage the Entity 360 tools to define core business entities, relationships, and transformations, and to configure customized user interface components.

Objectives

After successfully completing this course, students should be able to:

  • Describe MDM Tool spectrum
  • Create business entities, reference entities, and relationships
  • Define basic and extended search queries
  • Define role-specific entity layouts and home pages
  • Configure ElasticSearch
  • Define Data transformations for cleansing
  • Describe workflows and ActiveVOS integration

Target Audience

  • Developer

Prerequisites

Agenda

Module 1: Introduction to Entity 360

  • Master Data Overview
  • MDM tool spectrum and user matrix
  • IDD classic and subject areas
  • Entity 360 and the provisioning tool
  • Navigate the Entity 360 user interface
  • Define Queries
  • ActiveVOS Integration and workflows
  • Lab: Start services and add records
  • Lab: Two-step workflow
  • Lab: Queries

Module 2: Configure Entity 360

  • Feature comparison of IDD classic and Entity 360
  • MDM Entity 360 architecture
  • Entity 360 integration options
  • Components of Entity 360
  • Define Entity 360 application
  • Configure lookups, business entities, and business entity views
  • Lab: Create Application
  • Lab: Configure Reference Entities
  • Lab: Configure Business Entities and Relationships
  • Lab: Configure Business Entity Views

Module 3: Customize User Interface

  • Configure role specific entity views and home pages
  • Define the task manager, similar record, and get-related components
  • Lab: Customize E360 External Components
  • Lab: Customize E360 UI Layouts
  • Lab: Customize E360 Home pages
  • Lab: Related Records

Module 4: Data Cleansing

  • Cleansing components
  • External cleansing capabilities
  • Cleansing business entities
  • Define business entity to view, view to business entity, and business entity to business entity components
  • Lab: Person Cleanse
  • Lab: Debit Card
  • Lab: Address Doctor for Organization

Module 5: Smart Search

  • Smart Search Configuration
  • Solr-based smart search
  • Configure ElasticSearch
  • Define custom views for search results
  • Lab: Smart Search
  • Lab: Custom Search
  • Lab: Filters

Module 6: Manage Data

  • Discuss Workflow Engines, Workflows, and Tasks
  • Define Templates, Task Types, and Triggers
  • Lab: Multi-level add and update approval
  • Lab: Custom Merge Layout
  • Lab: Two-step Merge Approval
  • Lab: Custom Unmerge Layout
  • Lab: Two-step Unmerge Approval

Module 7: MDM Applications

  • List MDM Applications
  • Describe MDM Applications
  • Extend the Customer 360 Application
  • Lab: Extend C360 Data Model

Module 8: Localization and Log Files

  • Describe localization
  • Identify log files for troubleshooting

> Multidomain MDM 10.3: Developer, Specialist Certification

Certification Overview

This exam measures your competency as a member of a project implementation team. You should be able to explain and identify the MDM product architecture and to configure its main components including cleanse engine, match server, data governance tools, and workflows.

When you are ready to test and become an Informatica Certified Specialist (ICS), please follow these steps:

  1. Click Enroll and log in to your Informatica account.
  2. Click Add to Cart and complete your registration/purchase.
  3. Once you have registered go to My Training and View Your Transcript.
  4. Now you can simply Launch and take your test Anytime/Anywhere prior to your test’s expiry date.

Use the information below as a guide when preparing for the test. It includes an outline of the technical topics and subject areas covered in the test, the test domain weighting, test objectives, and topical content.

Target Audience

  • Developer

Prerequisites

The skills and knowledge areas measured by this test focus on the product's core functionality within a standard project implementation. Training materials, supporting documentation, and practical experience may serve as sources for question development.

The suggested training prerequisites for this certification level are the completion of the following Informatica course(s):

Skill Set Inventory

Test takers will be measured on:

  • The MDM Data Model
  • Configuring the Stage Process, the Load Process, Fuzzy Matching
  • MDM Entity 360
  • The user interface, data cleansing, and managing data

Test Domains

The test domains, and the extent to which each is represented as an estimated percentage of the test, are as follows:

Title % of Test
Introduction to MDM Multidomain Edition 3%
Define the MDM Data Model 9%
Configure the Stage Process 9%
Configure the Load Process 7%
Overview of Match and Merge Processes 3%
Configure Fuzzy Matching 9%
Configure the Merge Process 1%
Configure Data Access Views 1%
Configure Batch Processes and CLI Support 3%
User Objects 1%
Additional Product Features 4%
Introduction to Entity 360 9%
Configure Entity 360 4%
Customize User Interface 9%
Data Cleansing 9%
Smart Search 7%
Manage Data 9%
MDM Applications 3%
Localization and Log Files 1%
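
Since the test contains 70 questions, the domain weights above translate into approximate question counts. A quick back-of-the-envelope check in plain Python (an estimate only; the actual test may round counts differently):

```python
import math

total_questions = 70  # stated in the Test Topics section

# A few of the published domain weights from the table above
weights = {
    "Define the MDM Data Model": 0.09,
    "Configure the Load Process": 0.07,
    "Introduction to Entity 360": 0.09,
    "Configure the Merge Process": 0.01,
}

for domain, w in weights.items():
    print(f"{domain}: ~{round(w * total_questions)} questions")

# Minimum number of correct answers for the 70% passing grade
print("Correct answers needed to pass:", math.ceil(0.70 * total_questions))
```

So roughly six questions come from a 9% domain, and 49 of the 70 questions must be answered correctly to pass.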


Question Format

You may select one or more response options to answer each question.

A question is scored as correct if your response accurately completes the statement or answers the question. Plausible but incorrect distractors are included, so test takers without the required skills and experience may be drawn to them.

A passing grade of 70% is needed to achieve recognition as an Informatica Certified Specialist (ICS) in MDM 10.3: Developer.

You are given 90 minutes to complete the test. Formats used in this test are:

  • Multiple Choice: Select the one option that best answers the question or completes the statement
  • Multiple Response: Select all options that apply to best answer the question or complete the statement
  • True/False: After reading the statement or question, select the best answer

Test Policy

  • You are eligible for one attempt and, if needed, one retake per test registration.
  • If you do not pass on your first attempt:
    • Your test purchase includes one second attempt.
    • You must wait two weeks after a failed test before retaking it.
    • Any additional retakes are charged the fee current at the time of purchase.
    • Promotions are excluded and cannot be combined.

Test Topics

The test contains 70 questions drawn from the sections listed below. To ensure that you are prepared for the test, review the subtopics within each section.

Introduction to MDM Multidomain Edition

  • Master Data Management definitions
  • Key capabilities of Informatica’s Multidomain Edition (MDM Hub) solution
  • Informatica MDM Hub Application Server and Database Server tiers
  • Master data workflow within the Informatica MDM Hub
  • Capabilities of Trust Framework

Define the MDM Data Model

  • Data model elements
  • Relationships
  • Lookups

Configure the Stage Process

  • Process Server and creating mappings
  • Delta Detection and Audit Trail
  • Executing MDM Hub Processes

Configure the Load Process

  • Trust and Validation Rules
  • The Load Process

Overview of Match and Merge Processes

  • Resolving issues with Master Data Management
  • Duplications
  • Exact and Fuzzy matching
  • Match Tokens
  • Match Path
  • Match Columns
  • Multiple records match on multiple columns
  • Match Process Flow
  • Configure matching in the MDM Console

Configure Fuzzy Matching

  • Fuzzy matching
  • Populations
  • Match Keys
  • Key Type matching keys property
  • Key Width match key property
  • Search level used in Fuzzy matching
  • Configuring a range search
  • Fields for standard populations
  • Match Purpose fuzzy rules property

Configure the Merge Process

  • Consolidation state of a record
  • Sequence of the Merge Process
  • Distinguish when to use Automatic versus Manual Merge methods
  • Immutable Source System
  • Distinct system
  • Rules of a distinct source system
  • Unmerge Child When Parent Unmerges option
  • Selecting Merge Properties (Match/Merge Setup Details)
  • Automatic match and merge
  • Running automatic match and merge jobs
  • External Matching

Configure Data Access Views

  • Goals of queries and packages
  • Impact Analysis tool
  • Packages types
  • Creating a PUT-enabled package

Configure Batch Processes and CLI Support

  • Configuring and Executing Batch Processes
  • Jobs most often assembled into Batch Groups
  • Assembling a Batch Group
  • Batch Groups, Batch Group Levels, and Batch Jobs
  • Creating and Executing a Batch Group
  • Running Jobs through CLI
  • CLI configuration
  • Password encryption
  • CLI capabilities

User Objects

  • User objects registered with MDM
  • User Object Registry tool
  • User Exits
  • Exits used in the Stage, Load, Match, Merge, and Unmerge processes

Additional Product Features

Introduction to Entity 360

  • MDM tool spectrum and user matrix
  • IDD classic and subject areas
  • Entity 360 and the provisioning tool
  • Navigate the Entity 360 user interface
  • ActiveVOS Integration and workflows
  • Start services and add records
  • Two-step workflow
  • Queries

Configure Entity 360

  • MDM Entity 360 architecture
  • Entity 360 integration options
  • Components of Entity 360
  • Lookups, business entities, and business entity views
  • Reference Entities
  • Business Entities and Relationships
  • Business Entity Views

Customize User Interface

  • Role specific entity views and home pages
  • Task manager, similar record, and get-related components
  • E360 External Components
  • E360 UI Layouts
  • E360 Home pages
  • Related Records

Data Cleansing

  • Cleansing components
  • External cleansing capabilities
  • Cleansing business entities
  • Business entity
  • Address Doctor for Organization

Smart Search

  • Smart Search Configuration
  • Solr-based smart search
  • Configure ElasticSearch
  • Define custom views for search results
  • Smart Search
  • Custom Search
  • Filters

Manage Data

  • Discuss Workflow Engines, Workflows, and Tasks
  • Define Templates, Task Types, and Triggers
  • Multi-level add and update approval
  • Custom Merge Layout
  • Two-step Merge Approval
  • Custom Unmerge Layout
  • Two-step Unmerge Approval

MDM Applications

  • MDM Applications
  • Customer 360 Application
  • C360 Data Model

Localization and Log Files

  • Describe localization
  • Identify log files for troubleshooting

Sample Test Questions

Entity 360 applications are built with the _____.
A. Configuration Manager
B. Repository Manager
C. Hub Console
D. Provisioning Tool
Correct answer: D

Which statement is correct concerning Load jobs?
A. Load jobs should be run only after the associated Stage jobs have completed successfully.
B. The Load job for the parent table should be run after the Load job for the child table.
C. Load jobs cannot be run until all staging tables in the ORS have been populated.
D. None of the above.
Correct answer: A

Match Rules may contain one or more Match Rule Sets.
A. True
B. False
Correct answer: B

Which is true when creating an Entity to Entity View Transformation?
A. The Entity View must be created first.
B. The Entity View may be created afterwards.
C. The Entity View has to be created afterwards.
D. The Entity View may be created first.
Correct answer: A

Customer 360 User Interface has a pre-defined Entity 360 configuration.
A. True
B. False
Correct answer: A

Additional Information

Retake Policy: Current purchases of the test will include one second-attempt if a student does not pass a test. Any additional retakes are charged the current fee at the time of purchase. Promotions are excluded and cannot be combined. You must wait two weeks after a failed test to take the test again.

Informatica University has a community page so students can assist one another in their test preparation within the Informatica Network: https://network.informatica.com/welcome

For more information on Informatica Certifications visit https://www.informatica.com/services-and-training/certification.html
To find the class that is right for you, fill out the form at the top of this page, or visit our website at http://www.informatica.com/us/services-and-training/training/. For onsite class information contact your local Education Sales Specialist.

> Multidomain MDM Installation for Administrators and Developers

Course Overview

This course is applicable to software versions 10 through current. Gain the skills required to install and configure Multidomain MDM. Learn to perform a clean Multidomain MDM and ActiveVOS installation on the Windows platform with the JBoss application server and an Oracle database.

Objectives

After successfully completing this course, students should be able to:

  • Install and configure JBoss EAP 6.4
  • Install the MDM Hub and Process Servers
  • Install ActiveVOS with MDM
  • Verify MDM installation

Target Audience

  • Administrator
  • Developer

Prerequisites

  • None

> MDM: Multidomain Edition Configuration

Course Overview

This course is applicable to software version 10.2. Gain the skills necessary to configure a data model and implement business rules in Informatica MDM Multidomain Edition (MDM Hub). Learn the fundamental aspects of an MDM implementation and options for extending product capabilities.

Objectives

After successfully completing this course, students will be able to:

  • Define and create a data model within MDM
  • Configure MDM processes including Stage, Load, and Match
  • Leverage Merge Manager and Data Manager for Merge and Unmerge processes
  • Utilize Log files

Target Audience

  • Data Analyst
  • Data Steward
  • Developer

Prerequisites

  • None

Agenda

Module 1: Introduction to MDM Multidomain Edition

  • Master Data Management definitions
  • Goals of Master Data Management (MDM)
  • Key capabilities of Informatica’s Multidomain Edition (MDM Hub) solution
  • Informatica MDM Hub Application Server and Database Server tiers
  • Master data workflow within the Informatica MDM Hub
  • Capabilities of Trust Framework

Module 2: Define the MDM Data Model

  • Data model elements
    • Operational Reference Store
    • Landing tables purpose, content, and best practices
    • Base object tables purpose and content
    • Staging tables purpose and content
    • Creating landing tables, base object tables, and staging tables
  • Relationships
    • One-to-Many and Many-to-Many relationship types
    • Foreign-key relationships between base objects of the data model
    • Many-to-many relationships between the base objects of the data model
  • Lookups
    • Automatic lookups
    • Self-defined lookups

Module 3: Configure the Stage Process

  • Process Server and creating mappings
    • Purpose of the Process Server
    • Registering a Process Server
    • Basic and complex mappings
    • Building blocks used in transformational logic mapping
    • Purpose of a function
    • Associating function types to descriptions
    • Predefined function types
    • Constants
    • Conditional Execution Component
    • Basic Mapping
    • Cleanse List
    • Testing a mapping
    • Graph and cleanse functions in a Mapping
  • Delta Detection and Audit Trail
    • Delta Detection purpose
    • Conditions for using delta detection
    • Conditions under which Data Rejection occurs
    • Characteristics of Audit Trail
    • Delta detection options and raw data retention periods
  • Executing MDM Hub Processes
    • Stage Process Flow sequence
    • Batch Viewer
    • Stage job from INFA VIBE platform
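
Delta detection, in general terms, means staging only records that are new or changed since the previous extract. A minimal, product-agnostic sketch of the idea (the `id` key and the attribute-hashing scheme are illustrative assumptions, not how MDM Hub implements it):

```python
import hashlib

def row_hash(row: dict) -> str:
    """Hash the non-key attributes so a changed row produces a new hash."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row) if k != "id")
    return hashlib.sha256(payload.encode()).hexdigest()

def delta_detect(previous: list[dict], current: list[dict]) -> list[dict]:
    """Return only the rows that are new or changed since the last extract."""
    seen = {row["id"]: row_hash(row) for row in previous}
    return [row for row in current
            if row["id"] not in seen or seen[row["id"]] != row_hash(row)]

prev = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
curr = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Robert"}, {"id": 3, "name": "Cal"}]
print(delta_detect(prev, curr))  # row 2 changed, row 3 is new
```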

Module 4: Configure the Load Process

  • Trust and Validation Rules
    • Overview of Trust
    • Calculating Trust
    • Conditions for leveraging Trust
    • Characteristics of Validation Rules
    • Setting Trust
    • Behaviors for switching ON or OFF various functions
    • Adding a Validation Rule
    • Setting Cell and Null Updates
  • The Load Process
    • Load process for UPDATES, INSERTS, and REJECTS
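
The trust idea above can be sketched generically: each source carries a trust score for a column that decays as its value ages, and the most trusted value survives into the master record. A product-agnostic illustration (the linear decay and all sources, values, and numbers are assumptions; MDM Hub supports several decay curves and additional controls):

```python
def trust_score(max_trust: float, min_trust: float,
                decay_days: int, age_days: int) -> float:
    """Trust decays from max toward min as the value ages (linear decay here)."""
    if age_days >= decay_days:
        return min_trust
    return max_trust - (max_trust - min_trust) * age_days / decay_days

# Survivorship: for a given cell, the value from the most trusted source wins.
candidates = [
    {"source": "CRM",     "value": "Ann Smith",  "score": trust_score(90, 40, 365, 200)},
    {"source": "Billing", "value": "Anne Smith", "score": trust_score(70, 30, 365, 10)},
]
winner = max(candidates, key=lambda c: c["score"])
print(winner["source"], "->", winner["value"])
```

Note how a fresher value from a lower-trust source can still outscore a stale value from a higher-trust one.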

Module 5A: Overview of Match and Merge Processes

  • Resolving issues with Master Data Management
  • Resolving duplication
  • Challenges associated with duplicate records
  • Purposes of Matching and Merging
  • Exact and Fuzzy matching
  • Match Tokens
  • Match Path
  • Match Columns
  • Characteristics of the Provider Columns for Match Columns
  • Multiple records match on multiple columns
  • Match Process Flow
  • Configure matching in the MDM Console

Module 5B: Configure Exact Matching

  • Match rules
  • Exact matching
  • Match rule properties for filtering (Matching NULLS and Segment Matching)
  • Match rules using match columns

Module 5C: Configure Fuzzy Matching

  • Fuzzy matching
  • Characteristics of Populations
  • Matching population types to their descriptions
  • Match Keys
  • Key Type matching keys property
  • Key Width match key property
  • Search level used in Fuzzy matching
  • Configuring a range search
  • Fields for standard populations
  • Match Purpose fuzzy rules property
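
In general terms, fuzzy matching first generates match keys to narrow the candidate set (the key type and key width properties control how keys are built), then scores only the surviving candidates. A deliberately crude sketch of that two-phase idea in plain Python (the consonant key and `difflib` scoring are illustrative stand-ins, not MDM Hub's SSA-NAME3 key generation or population-based matching):

```python
from difflib import SequenceMatcher

def match_key(name: str, width: int = 4) -> str:
    """Crude match key: the name's consonants, uppercased and truncated
    to the key width."""
    consonants = [c for c in name.upper() if c.isalpha() and c not in "AEIOU"]
    return "".join(consonants)[:width]

def fuzzy_candidates(target: str, records: list[str],
                     threshold: float = 0.8) -> list[str]:
    """Score only records that share the target's match key, and keep
    those whose similarity meets the threshold."""
    key = match_key(target)
    return [r for r in records
            if match_key(r) == key
            and SequenceMatcher(None, target.lower(), r.lower()).ratio() >= threshold]

print(fuzzy_candidates("Jonathan Smith", ["Jonathon Smith", "John Smyth", "Mary Jones"]))
```

A wider key keeps fewer candidates (faster, but may miss matches); a narrower key keeps more (slower, but more thorough), which mirrors the key width trade-off listed above.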

Module 5D: Configure the Merge Process

  • Consolidation state of a record
  • Sequence of the Merge Process
  • Distinguish when to use Automatic versus Manual Merge methods
  • Immutable Source System
  • Distinct system
  • Rules of a distinct source system
  • Unmerge Child When Parent Unmerges option
  • Selecting Merge Properties (Match/Merge Setup Details)
  • Automatic match and merge
  • Running automatic match and merge jobs
  • External Matching

Module 6: Configure Data Access Views

  • Goals of queries and packages
  • Impact Analysis tool
  • Packages types
  • Creating a PUT-enabled package

Module 7: Configure Batch Processes and CLI Support

  • Configuring and Executing Batch Processes
    • Jobs most often assembled into Batch Groups
    • Assembling a Batch Group
    • Batch Groups, Batch Group Levels, and Batch Jobs
    • Creating and Executing a Batch Group
  • Running Jobs through CLI
    • CLI configuration
    • Password encryption
    • CLI capabilities

Module 8: Utilize Data Management Tools

  • The purposes of Data Steward and Merge Manager
  • Editing and merging records manually
  • Using Data Manager
  • Results of an unmerge
  • Standard and tree unmerge

Module 9: System Information and Logs

  • Enterprise Manager
    • MDM Hub Console Enterprise Manager tabs
  • Log Files
    • Log files related to the MDM Hub Console
    • Using Enterprise Manager’s ORS Database tab

Module 10: User Objects

  • User objects registered with MDM
  • User Object Registry tool
  • User Exits
  • Exits used in the Stage, Load, Match, Merge, and Unmerge processes

Module 11: Additional Product Features

  • Brief Overview of Additional Features
  • Exporting an ORS to a changelist using Metadata Manager

>> Multidomain MDM 10: Administrator, Specialist Certification

Certification Overview

This test measures your competency as a member of a Master Data Management administration team. The skills and capabilities tested cover installation and upgrade (both test and user environments), performance tuning, where to find information, troubleshooting, batch processes, managing metadata, and security.

When you are ready to test and become an Informatica Certified Specialist (ICS), please follow these steps:

  1. Click Enroll and log in to your Informatica account.
  2. Click Add to Cart and complete your registration/purchase.
  3. Once you have registered go to My Training and View Your Transcript.
  4. Now you can simply Launch and take your test Anytime/Anywhere prior to your test’s expiry date.

Use the information below as a guide when preparing for the test. It includes an outline of the technical topics and subject areas covered in the test, the test domain weighting, test objectives, and topical content.

Target Audience

  • Administrator

Prerequisites

The skills and knowledge areas measured by this test focus on the product's core functionality within a standard project implementation. Training materials, supporting documentation, and practical experience may serve as sources for question development.

The suggested training prerequisites for this certification level are the completion of the following Informatica course(s):

  • Multidomain MDM Administration and Installation (Instructor Led) OR Multidomain MDM for Administrators (onDemand)
  • MDM: Multidomain and Hierarchy Configuration (Instructor Led) or MDM: Multidomain Edition Configuration (onDemand)

Skill Set Inventory

Test takers will be measured on:

  • Informatica Master Data Management installation and upgrade process
  • Identifying the components of Master Data Management Hub Architecture
  • Master Data Management Hub Administration tasks
  • Troubleshooting and performance tuning of batch processes

Test Domains

The test domains, and the extent to which each is represented as an estimated percentage of the test, are as follows:

Title % of Test
Intro to Master Data Management Hub 10%
Key Concepts 30%
Master Data Management Hub Administration 60%

Question Format

You may select one or more response options to answer each question.

A question is scored as correct if your response accurately completes the statement or answers the question. Plausible but incorrect distractors are included, so test takers without the required skills and experience may be drawn to them.

A passing grade of 70% is needed to achieve recognition as an Informatica Certified Specialist (ICS) in Multidomain MDM 10: Administrator.

You are given 90 minutes to complete the test. Formats used in this test are:

  • Multiple Choice: Select the one option that best answers the question or completes the statement
  • Multiple Response: Select all options that apply to best answer the question or complete the statement
  • True/False: After reading the statement or question, select the best answer

Test Policy

  • You are eligible for one attempt and, if needed, one retake per test registration.
  • If you do not pass on your first attempt:
    • Your test purchase includes one second attempt.
    • You must wait two weeks after a failed test before retaking it.
    • Any additional retakes are charged the fee current at the time of purchase.
    • Promotions are excluded and cannot be combined.

Test Topics

The test contains 70 questions drawn from the sections listed below. To ensure that you are prepared for the test, review the subtopics within each section.

Hub Console

10% of the exam covers the following topics.

  • Hub Console Configuration
  • Hub Store

Key Concepts

30% of the exam covers the following topics and key concepts, measuring your understanding of the Informatica MDM Hub, Hub architecture, and functionality.

  • Base Objects
  • Consolidate Process
  • Cross-Reference (XREF) Tables
  • Databases in the Hub Store
  • History Tables
  • Batch Processes

Master Data Management Hub Administration

60% of the exam covers the following topics and concepts, measuring your understanding of the Informatica MDM Hub administration capabilities.

  • Administration tasks
  • Hub Console
  • Metadata Manager
  • Enterprise Manager
  • Security Providers
  • Security Access Manager configuration
  • Database and application server logging
  • Troubleshooting

Sample Test Questions

What is the purpose of the Environment report?
A. To capture information about metadata
B. To capture information about data
C. To create a batch process report
D. To create a report about the installation configuration of the MDM Hub
Correct answer: D

When are stage job entries created in the Batch Viewer?
A. After Base Objects are created
B. As part of the Hub Server installation
C. After stage tables are created
D. After an ORS Database is created
Correct answer: A

The SiperianClient jar file and the associated Javadocs are installed with which of the following?
A. Informatica MDM Hub Server
B. Informatica MDM Hub Cleanse Match Server
C. Informatica MDM Hub Resource Kit
D. Informatica MDM Hub Store
Correct answer: C

Which of the following application servers is not supported by Informatica MDM Hub?
A. GlassFish
B. JBoss
C. WebLogic
D. WebSphere
Correct answer: A

Which of the following tools is not part of the Configuration workbench?
A. Users
B. Roles
C. Security Providers
D. Message Queues
Correct answer: B

Additional Information

Retake Policy: Current purchases of the test will include one second-attempt if a student does not pass a test. Any additional retakes are charged the current fee at the time of purchase. Promotions are excluded and cannot be combined. You must wait two weeks after a failed test to take the test again.

Informatica University has a community page so students can assist one another in their test preparation within the Informatica Network: https://network.informatica.com/welcome

For more information on Informatica Certifications visit https://www.informatica.com/services-and-training/certification.html
To find the class that is right for you, fill out the form at the top of this page, or visit our website at http://www.informatica.com/us/services-and-training/training/. For onsite class information contact your local Education Sales Specialist.

>> MDM: Using Informatica Data Director

Course Overview

This course provides business users, such as Data Stewards, with the knowledge and skills needed to manage the data in a master reference store using Informatica Data Director (IDD). The class is taught using a generic sample IDD application and data model. This course is applicable to software version 10.1.

Objectives

After successfully completing this course, students should be able to:

  • Discuss Master Data Management
  • Search, cleanse and consolidate data
  • Import and Export Data
  • Use Timelines
  • Manage workflow
  • Manage IDD Security
  • Work with Hierarchies

Target Audience

  • Data Analyst
  • Administrator
  • Developer

Prerequisites

  • Metadata concepts
  • Database concepts

Agenda

Module 1: Introduction

  • Master Data Management
  • Data Model and IDD Representation

Module 2: Searching, Cleansing and Consolidating Data

  • Use and configure Search Capabilities like Smart Search
  • Use Cleanse Functions
  • Work with Merge Candidates
  • View cross references and history
  • Merging and Unmerging Records
  • View Business Entities

Module 3: Exporting and Importing Data

Module 4: Timelines, Hierarchies, Workflows, and Security

  • Timelines
  • Hierarchies
  • Workflows
  • IDD Security
  • Task Assignment and Notifications

>> Using Entity 360

Course Overview

This course is applicable to software version 10.3. Learn how to maintain and improve master data to manage business entities, define and share queries to search for business entities, generate the search indexes required for smart search, and leverage the cleanse mechanism to enrich addresses through Address Doctor. Gain the skills to build and manage hierarchies, merge duplicate records to create master records, perform an unmerge, and collaborate as a team to onboard new business entities through multi-level approval workflows.

Objectives

After successfully completing this course, students should be able to:

  • Navigate an Entity 360 application
  • Describe load process and trust settings
  • Manage (add, edit, update, and delete) business entities
  • Define basic and extended search queries
  • Relate business entities and build hierarchies
  • Generate indexes for smart search
  • Describe and list Entity 360 views and layouts
  • Describe cleanse operations and enrich address through address doctor
  • Perform merge and unmerge tasks
  • Collaborate as a team to complete workflows

Target Audience

  • Data Steward
  • End User

Prerequisites

  • None

Agenda

Module 1: Introduction to Entity 360

  • Master Data and MDM terminology
  • Load process and trust settings
  • Entity 360 architecture, data flow, and MDM Tools
  • Entity 360 components and role specific access
  • Data Governance
  • Basic and extended queries
  • Hierarchies and Relationships
  • Data export and smart search
  • Lab: Define basic and extended search queries
  • Lab: Generate smart search indexes

Module 2: Data Onboarding

  • Entity 360 user interface and views
  • Related and Similar records components
  • Matching records and cross reference views
  • Match and Merge a business entity
  • Lab: Add records from the web application and merge two records for Person business entity
  • Lab: Merge two records for Organization business entity

Module 3: Cleansing and Enriching

  • Cleanse operations and use cases
  • Address enrichment through the address doctor
  • Lab: Enrich an address using the address doctor

Module 4: Workflows

  • ActiveVOS BPM workflows
  • Users and user roles
  • Triggers, tasks, and workflows
  • Lab: Onboard data with multilevel approval
  • Lab: Perform a two-step merge workflow
  • Lab: Perform a two-step unmerge workflow

=====

> Cloud

IICS: Cloud Data Integration Services (Instructor Led OR onDemand)

Cloud Data and Application Integration Specialist

IICS: Cloud Data Integration Services (Instructor Led OR onDemand)
Cloud Application Integration Services (Instructor Led or onDemand)

Cloud B2B Gateway

Informatica Cloud B2B Gateway: Foundations (Instructor Led OR onDemand)

>> IICS: Cloud Data Integration Services

Course Overview

This course is applicable to version R32.  Learn the fundamentals of Informatica Intelligent Cloud Services (IICS) including the architecture and data integration features, synchronization tasks, cloud mapping designer, masking tasks, and replication tasks. This course enables you to operate and manage user security, secure agents, and monitor tasks and resources in IICS.

Objectives

After successfully completing this course, students should be able to:

  • Describe Informatica Cloud Architecture
  • Install the secure agent and create connections
  • Create Synchronization task
  • Use Cloud Mapping Designer to create Mappings and Mapping Tasks
  • Create a Replication task
  • Create a Masking task
  • Create a Mass Ingestion task
  • Create Taskflows
  • Use IICS REST web services for data integration
  • Use Intelligent Structure Model to parse data
  • Handle exceptions
  • Use advanced data integration features to optimize performance of jobs
  • Automate and monitor tasks
  • Configure advanced administration settings in IICS
  • Distinguish users and groups
  • Configure custom roles in IICS
  • Configure SAML setup
  • Use Discovery IQ features to manage, monitor, and troubleshoot integration processes

Target Audience

  • Operator
  • Developer

Prerequisites

  • None

Agenda

Module 1: Informatica Cloud Overview

  • Informatica Intelligent Cloud Services (IICS) as an iPaaS solution
  • Informatica Cloud Terminologies
  • Informatica Cloud Architecture
  • CDI Assets
  • CDI Components
  • Lab: Navigating the IICS interface

Module 2: Runtime Environments and Connections

  • Runtime Environments
  • Secure Agent Architecture
  • IICS Log Files
  • Connections
  • Connection Types
  • Creating Connections
  • Lab: Creating a Salesforce connection
  • Lab: Creating a Flat File connection
  • Lab: Creating an Oracle connection

Module 3: Synchronization Task

  • Synchronization Task Overview
  • Synchronization Task – Definition Step
  • Synchronization Task – Source Step
  • Synchronization Task – Target Step
  • Synchronization Task – Data Filters Step
  • Synchronization Task – Field Mapping Step
  • Synchronization Task – Schedule Step
  • Activity Monitor
  • Lab: Creating a Synchronization Task
  • Lab: Using Filter, Expression and Lookup in a Synchronization Task
  • Lab: Creating a Synchronization Task with Multiple Object Source Type
  • Lab: Using Pre and Post SQL commands in a Synchronization Task

Module 4: Cloud Mapping Designer – Basic Transformations

  • Cloud Mapping Designer Overview
  • CLAIRE Transformation Recommendations
  • Mapping Designer Terminologies
  • Source Transformation
  • Target Transformation
  • Filter Transformation
  • Joiner Transformation
  • Expression Transformation
  • Lookup Transformation
  • Field Rules
  • Best Practices for Creating Mappings
  • Lab: Creating a mapping using basic transformations

Module 5: Advanced Transformations and Mapping Tasks

  • Aggregator Transformation
  • Normalizer Transformation
  • Java Transformation
  • SQL Transformation
  • Union Transformation
  • Lookup Transformation
  • Rank Transformation
  • Sequence Generator Transformation
  • Data Masking Transformation
  • Mapping Task
  • Lab: Using Normalizer, Aggregator, and Rank transformations in a mapping
  • Lab: Creating a Mapping Task
  • Lab: Creating a mapping using Unconnected Lookup Transformation
  • Lab: Creating a mapping using SQL Transformation

Module 6: Mapping Parameters

  • Parameterization use cases
  • Adding Parameters to a Mapping
  • Creating parameters
  • Parameter Types
  • Using parameter files
  • Parameter Best Practices
  • Lab: Performing Complete Parameterization
  • Lab: Using Parameter File in a Mapping task
  • Lab: Using In-Out parameters for incremental data loading
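The parameter files used in the labs above are plain text. A minimal sketch of one, assuming a mapping task named mt_LoadOrders in a Sales folder of the Default project (all names and values here are hypothetical), might look like:

```
#USE_SECTIONS
[Default].[Sales].[mt_LoadOrders]
$$SourceFilter=ORDER_DATE > '2024-01-01'
$$TargetSchema=STAGING
```

The #USE_SECTIONS directive marks the file as sectioned, so the values apply only to the task named in the section header.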

Module 7: Expression Macro and Dynamic Linking

  • Expression Macro
  • Dynamic Linking
  • Lab: Using Expression Macro in a mapping
  • Lab: Using Dynamic Linking in a mapping

Module 8: Replication Task

  • Replication Task Overview
  • Replication Task Features
  • Replication Task: Source and Target Options
  • Other Replication Task Options
  • Resetting the Target Table
  • Generating Non-Unique Index
  • Lab: Replicating Data to a Flat File

Module 9: Masking Task

  • Masking Task Overview
  • Masking Task: Source and Target Options
  • Data Subset: Row Limits and Data Filters
  • Field Masking Rules
  • Masking Rule Types
  • Refresh masking Task metadata
  • Reset a masking Task
  • Masking Best Practices
  • Lab: Creating a Masking Task

Module 10: Mass Ingestion Task

  • Mass Ingestion Task Overview
  • Functionalities
  • Configuring a Mass Ingestion Task
  • Creating a Mass Ingestion Task

Module 11: Taskflows

  • Taskflow overview
  • Linear Taskflows
  • Taskflow Templates
  • Using REST APIs
  • Invoke a Taskflow through a File Listener
  • Lab: Creating a Parallel Taskflow
  • Lab: Passing in-out parameters in a Taskflow
  • Lab: Invoking a Taskflow through a File Listener

Module 12: Advanced Options

  • Primary Key Chunking
  • Lookup SQL Override
  • Lab: Configuring SQL override setting in Lookup transformation
  • Lab: Using Primary Key chunking in a Synchronization task

Module 13: Hierarchical Connectivity

  • Web Service transformation
  • REST V2 Connector
  • Hierarchical Schemas
  • Hierarchy Parser Transformation
  • Hierarchy Builder Transformation
  • Lab: Creating a mapping using a REST V2 connector
  • Lab: Using Web Services transformation in a mapping
  • Lab: Creating a mapping using Hierarchy Parser Transformation
  • Lab: Creating a mapping using Hierarchy Builder Transformation
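As a conceptual aside, the flattening a Hierarchy Parser transformation performs, turning hierarchical JSON or XML into relational rows, can be sketched in plain Python (this is not Informatica code; the document shape and field names are invented):

```python
import json

def flatten_orders(doc: str):
    """Flatten a nested customer -> orders JSON document into
    relational-style rows: one row per child order, with the
    parent customer fields repeated on each row."""
    customer = json.loads(doc)
    rows = []
    for order in customer.get("orders", []):
        rows.append({
            "customer_id": customer["id"],
            "customer_name": customer["name"],
            "order_id": order["order_id"],
            "amount": order["amount"],
        })
    return rows

sample = '{"id": 7, "name": "Acme", "orders": [{"order_id": 1, "amount": 250.0}, {"order_id": 2, "amount": 99.5}]}'
rows = flatten_orders(sample)
```

The Hierarchy Builder transformation covered in the same module does the inverse: it assembles relational rows back into a hierarchical document.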

Module 14: Intelligent Structure Model

  • Intelligent Structure Model
  • Intelligent Structure Discovery Process
  • Intelligent Structure Example
  • Refining a Discovered Structure
  • Editing an Intelligent Structure Model
  • Using intelligent structure models in Structure Parser transformations
  • Lab: Creating an Intelligent Structure Model
  • Lab: Using Structure Parser transformation in a mapping

Module 15: IICS APIs

  • REST API overview
  • Informatica Cloud REST API
  • REST API Versions
  • Request Header and Request Body configuration
  • Return Lists
  • RunAJob Utility
  • Lab: Running a Mapping task using REST API
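The login-then-call pattern used in the lab can be sketched in Python. The endpoint paths below (`/ma/api/v2/user/login` and `<serverUrl>/api/v2/job`) reflect the v2 REST API as best understood here and should be checked against current documentation; the host, session id, and task id are placeholders, and the sketch only builds the requests rather than sending them:

```python
import json

LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"  # region host may differ

def login_request(username, password):
    """Build the v2 login call. The response to this POST carries the
    icSessionId and serverUrl needed for every subsequent call."""
    headers = {"Content-Type": "application/json", "Accept": "application/json"}
    body = {"@type": "login", "username": username, "password": password}
    return LOGIN_URL, headers, json.dumps(body)

def start_task_request(server_url, session_id, task_id):
    """Build the call that starts a mapping task: POST to /api/v2/job
    with the session id passed in the icSessionId header."""
    headers = {"Content-Type": "application/json", "icSessionId": session_id}
    body = {"@type": "job", "taskId": task_id, "taskType": "MTT"}  # MTT = mapping task
    return server_url + "/api/v2/job", headers, json.dumps(body)

url, headers, payload = start_task_request(
    "https://usw3.dm-us.informaticacloud.com/saas", "SESSION123", "0001ABC")
```

In practice, you would POST the login payload first, read icSessionId and serverUrl from the JSON response, and pass them to the job call; the RunAJob utility wraps this same sequence.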

Module 16: Exception Handling

  • Types of exceptions
  • User-defined exceptions
  • Non-fatal exceptions
  • Default Field Value Setting
  • Row Error Logging
  • Error handling settings
  • Fatal exceptions
  • Bad files or Reject files
  • Lab: Creating a mapping to handle non-fatal errors

Module 17: Performance Tuning

  • Partitions Overview
  • Types of Partitions
  • Partitioning Rules and Guidelines
  • Pushdown optimization overview
  • Types of pushdown optimization
  • Cross-schema pushdown optimization
  • Pushdown optimization user-defined parameters
  • Pushdown compatible connections
  • Secure agent groups
  • Secure agent groups with multiple agents
  • Shared secure agent groups
  • DTM performance properties
  • Lab: Using partitions in a mapping
  • Lab: Using pushdown optimization in a mapping task
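To make the pushdown idea concrete: with full pushdown, a filter-plus-aggregator mapping is rewritten as a single SQL statement that the database executes itself, so no rows stream through the Secure Agent. A deliberately simplified Python sketch of that rewrite (table and column names invented; real pushdown is far more involved):

```python
def pushdown_sql(source, target, group_by, measure, filter_expr):
    """Rewrite a filter + aggregator mapping as one INSERT ... SELECT
    so the filtering and aggregation happen inside the database."""
    return (
        f"INSERT INTO {target} ({group_by}, total_{measure}) "
        f"SELECT {group_by}, SUM({measure}) FROM {source} "
        f"WHERE {filter_expr} "
        f"GROUP BY {group_by}"
    )

sql = pushdown_sql("ORDERS", "ORDER_TOTALS", "REGION", "AMOUNT", "STATUS = 'SHIPPED'")
```

This also shows why pushdown needs compatible connections: the generated SQL must be valid for, and executable by, the source or target system.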

Module 18: Automating and Monitoring Tasks

  • Schedules
  • Schedule Repeat Frequency
  • Schedule Blackout Period
  • Monitoring tasks
  • Email Notifications
  • Event Monitoring
  • Lab: Creating a schedule

Module 19: Administration

  • Licenses
  • Administrator Service
  • User Roles (System-defined Roles and Custom Roles)
  • Creating a User
  • User Groups
  • Permissions
  • Object-Level Permissions
  • Organization Hierarchy
  • Sub-Organization
  • Importing and Exporting Assets
  • Add-on Bundles
  • Lab: Configure Administrative settings for your Informatica Cloud org
  • Lab: Creating a sub-organization and importing/exporting assets

Module 20: SAML Setup

  • Security Assertion Markup Language (SAML) Overview
  • SAML requirements
  • SAML restrictions
  • SAML configuration

Module 21: Discovery IQ

  • Discuss Informatica Discovery IQ
  • List the features of Discovery IQ
  • Exploring Discovery IQ features

>> Cloud Application Integration Services for Developers

Course Overview

This course is applicable to version R33. Gain the skills necessary to integrate applications and systems, implement business processes with Process Designer that can access both on-premises and cloud data, and expose them as composite APIs. This course is applicable to the Cloud Application Integration offering within Informatica Intelligent Cloud Services (IICS).

Objectives

After successfully completing this course, students should be able to:

  • Explain Informatica Intelligent Cloud Services (IICS) and Cloud Application Integration (CAI)
  • Describe Processes and Guides
  • Consume third-party applications and REST/SOAP services
  • Establish error handling routines
  • Expose developed assets as API endpoints
  • Manage APIs

Target Audience

  • Developer

Prerequisites

  • None

Agenda

Module 1: Overview of Cloud Application Integration

  • IICS Platform
  • Cloud Application Integration
  • Significance of Cloud Integration
  • CAI Features
  • CAI Assets
  • CAI Components

Module 2: Understand the Basics: Process Designer

  • Processes
  • Process Designer
  • Process Steps
  • Lab: Create a basic process to display user input

Module 3: Working with Assets

  • CAI assets
  • Process Objects
  • Connectors
  • Service Connectors
  • Connections
  • File Connection
  • Amazon S3 Connection
  • Kafka Connection
  • Salesforce Connection
  • RabbitMQ Connection
  • Lab: Create a Process Object
  • Lab: Create a Service Connector
  • Lab: Create a connection using service connector
  • Lab: Create a JDBC Connection
  • Lab: Create a Customer and Order details process
  • Lab: Create a File Connection
  • Lab: Create a Salesforce Connection
  • Lab: Create a Kafka Connection

Module 4: Adding Web Services to a Process

  • REST-based Service Connectors
  • Synchronous and Asynchronous Web Service Call Processes
  • Lab: Invoke a Synchronous Web Service Call
  • Lab: Invoke an Asynchronous Web Service Call

Module 5: Fault Handling

  • Fault handling
  • Significance of fault handling
  • Fault handling techniques
  • Fault Triggers and methods to return a fault from a process
  • Lab: Handling Sub-process Failures
  • Lab: Handling Credential Failures

Module 6: Introduction to Guides Designer

  • Salesforce Managed Package
  • Guides Designer
  • Guides Designer Steps
  • Lab: Create a guide

Module 7: API Management

  • API Manager
  • API Registry
  • API Groups
  • API Portal
  • Managed API
  • Lab: API Manager

Module 8: Troubleshooting, Tips & Tricks, Best Practices

  • Assets and Processes Management
  • Common Issues
  • Troubleshooting Methods

Module 9: Live Project

  • Lab: Automotive Services

>> Cloud Data and Application Integration, Specialist Certification

Course Overview

This test measures your knowledge of Informatica Cloud Services Data and Application Integration. Measured areas include Informatica Cloud architecture, data integration features, data synchronization, cloud mapping designer, data masking, and data replication. It also tests your knowledge of integrating applications and systems, implementing business processes using the process designer, and exposing composite APIs.

When you are ready to test and become an Informatica Certified Specialist (ICS), please follow these steps:

  1. Click Enroll and log in to your Informatica account.
  2. Click Add to Cart and complete your registration/purchase.
  3. Once you have registered, go to My Training and View Your Transcript.
  4. Launch and take your test anytime, anywhere before your test’s expiry date.

Use the outline below as a guide when preparing for the test. It covers the technical topics and subject areas included in the test, the test domain weighting, the test objectives, and topical content.

Skill Set Inventory

Test takers will be measured on:

  • Building and running tasks, mappings, and taskflows
  • Installation of Informatica Cloud Secure Agent
  • Automation of data integration jobs
  • Cloud Mapping Designer and Mapping Tasks
  • Advanced features used to enhance and optimize integrations
  • Administration tasks in IICS org
  • Transformations like Hierarchy Parser, Unconnected Lookups
  • Advanced features such as Macro Expressions and Dynamic Linking
  • Parameterization and Pushdown Optimization features
  • REST web services
  • Advanced settings such as PK Chunking, Pre/Post SQL, and SQL Override
  • Discovery IQ
  • Cloud Application Integration (CAI)
  • Basic flows using process
  • Flows as API endpoints
  • Third-party REST/SOAP services in the flow
  • Error handling
  • APIs using API portal

Test Domains

The test domains and the extent to which they are represented as an estimated percentage of the test follows:

Title % of Test
Cloud Overview 3%
Informatica Cloud Secure Agent and Architecture 3%
Creating Connections 3%
Data Synchronization Application 3%
Relationships and Integration 3%
Qualifying and Transforming Source Data 3%
Advanced Source Options and Field Lookups 2%
Data Replication Application 3%
Automating and Monitoring Tasks 3%
Cloud Mapping Designer 9%
Mapping Task and Mapping Parameters 6%
Advanced Task Options and Salesforce Options 4%
Data Masking Application 3%
Basic Administration and Advanced Administration 8%
Taskflows: Linear Taskflow and Mass Ingestion Task 3%
Introduction to Cloud & Platform Interoperability 1%
Expression Macro and Dynamic Linking 3%
Hierarchical Connectivity 4%
Informatica Cloud REST API 1%
Performance and Scalability 7%
Advanced Properties Settings 4%
Discovery IQ 1%
Advanced Parameterization and REST Utilization 3%
Exception Handling 3%
Introduction to Cloud Application Integration and Basic Building Blocks 2%
Process Samples 4%
Error Handling and Working with API 2%
Connections 4%

Sample Test Questions

Informatica Cloud allows users to:
A. Synchronize account data (correct)
B. Work from their handheld devices
C. Import data on sales leads (correct)
D. Monitor PowerCenter jobs

Informatica Cloud system variables can be used to:
A. Read only those records that have been inserted or updated since the task last ran. (correct)
B. Read only those records that have changed since the last time the task ran. (correct)
C. Read only those records that were created since the date the task last ran. (correct)
D. Read only those records that were created since the datetime the task last ran. (correct)

Expression transformations cannot use aggregate functions.
A. True (correct)
B. False

Which of the following data load types are possible in IICS?
A. Cloud to Cloud (correct)
B. Cloud to Ground (correct)
C. Ground to Cloud (correct)
D. Ground to Ground (correct)

Which of the following services is used to design Processes in CAI?
A. API Portal
B. Process Designer (correct)
C. API Manager
D. Monitor

Additional Information

Retake Policy: Current purchases of the test include one second attempt if a student does not pass. Any additional retakes are charged the current fee at the time of purchase. Promotions are excluded and cannot be combined. You must wait two weeks after a failed test before retaking it.

Informatica University has a community page so students can assist one another in their test preparation within the Informatica Network: https://network.informatica.com/welcome

For more information on Informatica Certifications visit https://www.informatica.com/services-and-training/certification.html
To find the class that is right for you, fill out the form at the top of this page, or visit our website at http://www.informatica.com/us/services-and-training/training/. For onsite class information contact your local Education Sales Specialist.

>> Informatica Cloud B2B Gateway: Foundations

Course Overview

Applicable for Release 30. Learn to use Informatica Cloud B2B Gateway for electronic data interchange (EDI) processes. Master onboarding partners, setting up inbound and outbound flows for them, setting up Managed File Transfer (MFT) for AS2, and creating and tracking EDI messages.

Objectives

After successfully completing this course, students should be able to:

  • Apply Informatica Cloud B2B Gateway for EDI
  • Set up and manage Informatica Cloud B2B Gateway
  • Create and manage partners (customers and suppliers)
  • Develop Informatica Cloud mappings and tasks for EDI
  • Create inbound and outbound flows
  • Describe CLAIRE Intelligent Structure discovery to handle incoming non-EDI files
  • Create and track EDI messages

Target Audience

  • Developer

Prerequisites

  • None

Agenda

Module 01: Introduction to Informatica Cloud B2B Gateway

  • Electronic Data Interchange (EDI)
  • Informatica Cloud B2B Gateway for EDI processes
  • Features of Informatica Cloud B2B Gateway
  • Access Informatica Cloud B2B Gateway
  • Informatica Cloud B2B Gateway terminology
  • User Roles
  • Lab1: Getting started

Module 02: Partner Onboarding

  • B2B trading partners
  • Customer and supplier
  • Steps to create a partner
  • Message properties for EDI files
  • Custom files with custom mappings
  • Intelligent Structure Discovery message structures
  • Lab1: Create a partner

Module 03: Inbound Flow Setup

  • EDI 850 message
  • Inbound message setup
  • Inbound mappings
  • Inbound flow
  • Partner inbound properties
  • Lab1: Create an inbound mapping in IICS
  • Lab2: Configure inbound setup for a partner
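For orientation, an X12 850 message is just delimited text: by convention '~' terminates a segment and '*' separates elements (the actual delimiters are declared in the ISA envelope). A small Python sketch, using an invented and heavily abridged 850 body, shows the structure an inbound flow consumes:

```python
def parse_x12(message):
    """Split a raw X12 message into segments and elements, assuming
    '~' as the segment terminator and '*' as the element separator."""
    return [seg.split("*") for seg in message.strip().split("~") if seg]

# Abridged, illustrative 850 (purchase order) body -- not a complete interchange:
sample_850 = "ST*850*0001~BEG*00*SA*PO1001**20240102~PO1*1*10*EA*9.95**VP*SKU-42~SE*4*0001~"
segments = parse_x12(sample_850)
```

Each parsed segment starts with its identifier (ST, BEG, PO1, SE), which is how the gateway recognizes message types and maps elements to fields.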

Module 04: Outbound Flow Setup

  • EDI 810 message
  • Outbound flow
  • Outbound page properties
  • Lab1: Create an outbound mapping in IICS
  • Lab2: Configure outbound setup for a partner

Module 05: Handling custom files in B2B Gateway

  • Custom files
  • Steps to handle custom files using custom mappings
  • Cloud-scale AI-powered Real-time Engine (CLAIRE)
  • Capabilities of the CLAIRE Engine
  • Intelligent Structure Discovery
  • Use of Intelligent Structure Discovery
  • Message structure
  • Custom file monitoring

Module 06: Managed File Transfer and AS2 Endpoint Setup

  • Managed File Transfer (MFT) overview
  • MFT general features
  • MFT benefits
  • MFT application login
  • Common terminologies
  • Applicability Statement 2 (AS2) protocol overview
  • AS2 messages (Encryption)
  • AS2 messages (Signing)
  • AS2 messages (Compression)
  • Lab1: Configure AS2 Server

Module 07: Managed File Transfer Advanced Settings

  • CB2B and MFT integration
  • Open PGP
  • Encryption process
  • Decryption process
  • SSL Certificate Manager
  • Workflow for file transfer using SFTP
  • File monitor